A custom-built rack for the Maia 100 AI Accelerator and its "sidekick" inside a thermal chamber at a Microsoft lab in Redmond, Washington.
The first, its Maia 100 artificial intelligence chip, could compete with Nvidia's highly sought-after AI graphics processing units.
In addition to designing the Maia chip, Microsoft has devised custom liquid-cooled hardware called Sidekicks that fit in racks right next to racks containing Maia servers.
Moving from GPUs to AWS Trainium AI chips can be more complicated than migrating from Intel Xeons to Gravitons, though.