Big news from Microsoft: the company is expanding its infrastructure with two chips designed in-house, the Azure Maia 100 AI accelerator and the Azure Cobalt 100 Arm-based CPU.
The chips debuted at Microsoft Ignite 2023 in Seattle, the company's largest annual conference. They matter for businesses because they promise efficient, scalable, and more sustainable computing designed to get the most out of the latest cloud and AI advances.
Microsoft is clearly proud of the chips, calling them the last puzzle piece in its plan to build a fully flexible infrastructure that mixes its own silicon with hardware and software from partners, so it can optimize for whatever workload customers throw at it.
Here's the breakdown: Maia is the AI engine, built for large cloud AI workloads such as training and running generative AI applications. Cobalt is the general-purpose workhorse, handling everyday compute jobs while sipping power. Both chips arrive in Azure next year and will first run in Microsoft's own data centers, powering services such as Microsoft Copilot and Azure OpenAI Service.
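For a sense of the workloads Maia is meant to serve, here is a minimal sketch of an Azure OpenAI Service call in Python, using the openai package (v1+); the endpoint, key, and deployment name are hypothetical placeholders:

    from openai import AzureOpenAI

    # Hypothetical endpoint, key, and deployment name, for illustration only.
    client = AzureOpenAI(
        azure_endpoint="https://my-resource.openai.azure.com",
        api_key="YOUR_API_KEY",
        api_version="2023-05-15",
    )

    # Callers never pick the hardware; Azure decides what silicon serves this.
    response = client.chat.completions.create(
        model="my-gpt-deployment",  # name of your Azure OpenAI deployment
        messages=[{"role": "user", "content": "Explain custom AI silicon in one sentence."}],
    )
    print(response.choices[0].message.content)

The point of custom silicon is that nothing changes at this layer: if Maia does its job, it slots in beneath the same API without customers noticing.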
Scott Guthrie, Microsoft's executive vice president of Cloud + AI, said the company is rethinking every layer of its data center infrastructure to meet customer expectations. At Microsoft's scale of operations, optimizing and integrating each part of the stack pays off three ways: better performance, more options for customers, and less reliance on any single supplier.
So what exactly are Azure Maia and Cobalt packing? Microsoft hasn't shared full specifications yet, but it describes the Maia AI accelerator as built for some of the largest AI workloads on Azure, co-designed with Azure's hardware stack to squeeze out maximum utilization.
It turns out the chip has been in the works for some time with help from OpenAI, which has been refining and testing Maia against its own models; the shared goal is making AI models that are more capable and cost less for customers to run.
Cobalt's story is more hush-hush, but one thing is clear: it is all about handling a wide variety of Azure workloads without guzzling power. Its design prioritizes performance per watt, which means data centers can deliver more compute without a matching jump in the electricity bill.
On the physical side, Microsoft is custom-building server boards and racks for these chips, along with a liquid-cooling "sidekick" unit that keeps the silicon within thermal limits even under sustained load.
Microsoft isn't only backing its own hardware, though. It is also rolling out new virtual machine series for Nvidia's latest H100 GPUs and plans to add AMD MI300X-based VMs to the lineup soon. The upshot for businesses is choice: they can pick whichever option best balances performance and price.
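As a quick way to see which of those VM families are available in a given region, here is a minimal sketch using the azure-identity and azure-mgmt-compute Python packages; the region and the "Standard_NC" name filter are illustrative assumptions, not an official catalog query:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient

    # Assumes you are already logged in (e.g., via `az login`).
    subscription_id = "YOUR_SUBSCRIPTION_ID"
    client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

    # "Standard_NC" is the prefix Azure uses for Nvidia-accelerated sizes;
    # the filter here is a convenience, not an official GPU-family API.
    for size in client.virtual_machine_sizes.list(location="eastus"):
        if size.name.startswith("Standard_NC"):
            print(f"{size.name}: {size.number_of_cores} cores, {size.memory_in_mb} MB RAM")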
All in all, the rollout of these chips is set to start next year, and Microsoft is already busy cooking up the next generation.