The Maia and Cobalt custom chips will be available within Microsoft’s cloud infrastructure.
Microsoft will make two new chips available next year, the tech giant announced at the Microsoft Ignite conference on Nov. 15. The Microsoft Azure Maia 100 is designed for AI workloads, and the Microsoft Cobalt 100 CPU is designed for general compute workloads on the Microsoft Cloud.
Custom chips built to in-house specifications
Both the Maia 100 and Cobalt 100 chips are built in-house by Microsoft, which the tech giant says enables “everything from silicon choices, software and servers to racks and cooling systems” to be tailored to the customer workloads Microsoft predicts it will see, according to a press release.
The Microsoft Azure Maia 100 AI Accelerator is optimized for AI tasks and generative AI (Figure A). Microsoft shared its designs for the Maia 100 with OpenAI to ensure the chip would be optimized for large language model workloads.
The Microsoft Cobalt 100 CPU is an Arm-based processor designed for Microsoft Cloud (Figure B).
“Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our data centers to meet the needs of our customers,” said Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group, in the press release. “At the scale we operate, it’s important for us to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain and give customers infrastructure choice.”
Sidekick server racks offer liquid cooling
To make room in data centers for the Microsoft Maia 100 AI Accelerator, Microsoft developed custom server racks. These sidekick racks are wider than the typical Microsoft server format and sit next to the Microsoft Maia 100 rack.
Cooling fluid circulates from the sidekick to the Maia 100 rack and back, carrying heat away from the chips. The custom racks could also be used with silicon from industry partners.
New chips were designed for cloud workloads
Microsoft expects customers to use the new chips for AI and cloud computing from Microsoft’s data centers, including running Microsoft Copilot or Azure OpenAI Service. The Maia 100 and Cobalt 100 chips are made for custom racks within Microsoft data centers.
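From a customer's perspective, the new silicon sits behind the same Azure services rather than being programmed directly; for example, Azure OpenAI Service is reached through its REST API regardless of which hardware serves the request. As a minimal, stdlib-only sketch, here is how such a request could be assembled (the resource endpoint, deployment name and API version below are illustrative placeholders, not values from this article):

```python
import json

# Placeholder values for illustration only -- substitute your own
# Azure OpenAI resource endpoint, deployment name and API version.
ENDPOINT = "https://my-resource.openai.azure.com"
DEPLOYMENT = "my-gpt-deployment"
API_VERSION = "2024-02-01"

def build_chat_request(prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for an Azure OpenAI chat completion call.

    The request targets a named deployment; the underlying accelerator
    (Maia, or a partner GPU) is chosen by Azure, not by the caller.
    """
    url = (
        f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}"
        f"/chat/completions?api-version={API_VERSION}"
    )
    body = json.dumps(
        {"messages": [{"role": "user", "content": prompt}]}
    ).encode("utf-8")
    return url, body

url, body = build_chat_request("Summarize Azure's new silicon lineup.")
```

Sending the request (with an `api-key` header) is omitted here, since it requires a live Azure resource; the point is that the chip choice is invisible at this layer.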
SEE: Microsoft adds a Copilot preview to Azure.
Microsoft has been steadily bringing more of the Microsoft Cloud's components in-house. Custom silicon was the final piece of the puzzle.
“We have visibility into the entire stack, and silicon is just one of the ingredients,” said Rani Borkar, corporate vice president for Azure Hardware Systems and Infrastructure at Microsoft, in the press release.
Microsoft’s relationship to its silicon competitors
Building chips in-house means Microsoft doesn't have to rely on competitors' hardware to run large AI workloads. The Maia chip in particular could compete with NVIDIA's AI-focused GPUs. AMD, Arm, AWS, Intel, Meta, Google, SambaNova and Qualcomm also all produce chips meant for AI workloads.
SEE: NVIDIA revealed new chips for AI and high-performance computing workloads.
Borkar told The Verge she doesn't see the AI chip landscape as competition, but rather that Microsoft's chips can be "complementary" to its partnerships, including those with other companies in the AI chip space.
“All the things we build, whether infrastructure or software or firmware, we can leverage whether we deploy our chips or those from our industry partners,” said Pat Stemen, partner program manager on the AHSI team, in the press release. “This is a choice the customer gets to make, and we’re trying to provide the best set of options for them, whether it’s for performance or cost or any other dimension they care about.”
Microsoft is not planning to replace any existing hardware from AMD, Intel or NVIDIA. Instead, the company frames the first-party silicon as giving customers more choices.
Microsoft plans to produce second-generation versions of both the Maia and Cobalt chips at an unspecified time in the future.