AI chip, Maia 200, calling it “the most efficient inference system” the company has ever built. The Satya Nadella-led tech ...
Microsoft Azure's AI inference accelerator Maia 200 aims to outperform Google TPU v7 and AWS Inferentia with 10 Petaflops of FP4 compute power.
Microsoft has made $37.5bn in capital expenditures tied to the build-out, more than analysts expected, which is why investors keep watching the gap between Azure growth and the bill for new data-center construction.
Microsoft’s new Maia 200 inference accelerator enters this overheated market with a design that aims to cut the price ...
Cloud computing is rarely customer-facing, yet it underpins much of today’s data-intensive, scalable digital ...
Perplexity AI has signed a $750 million deal with Microsoft to access a wider range of frontier models from OpenAI and xAI via Microsoft Foundry.
The Maia 200 AI chip is described as an inference powerhouse, meaning it could help AI models apply their knowledge to ...
The Maia 200 deployment demonstrates that custom silicon has matured from experimental capability to production ...