Kevin Scott, Microsoft’s chief technology officer and executive vice president, speaks at a Microsoft event at the Seattle Convention Center Summit in Seattle, Washington, on May 21, 2024.
Jason Redmond | AFP | Getty Images
Microsoft’s chief technology officer said Wednesday that the company wants to primarily use its own chips in its data centers in the future, a move that could reduce its reliance on key players Nvidia and AMD.
Semiconductors and the servers that sit inside data centers underpin the development of artificial intelligence models and applications.
Nvidia has so far dominated the space with its graphics processing units (GPUs), while rival AMD holds a smaller slice of the market.
However, major cloud computing players, including Microsoft, have also been designing their own custom chips, specifically for data centers.
Microsoft’s chief technology officer, Kevin Scott, laid out the company’s AI chip strategy during a fireside chat hosted by CNBC at Italian Tech Week.
Microsoft currently uses mainly Nvidia and AMD chips in its own data centers, with the focus on choosing the silicon (another term for semiconductors) that offers the “best price performance” per chip.
“We’re not religious about what the chip is, and … that has meant that the best price performance solution has been Nvidia for years,” Scott said. “We will literally entertain anything to ensure we are capable of meeting this demand.”
At the same time, Microsoft also deploys some of its own chips.
In 2023, Microsoft launched the Azure Maia AI accelerator, designed for AI workloads, and the Cobalt CPU. The company is also reportedly working on next-generation semiconductor products. Last week, the U.S. technology giant unveiled a new cooling technology that uses “microfluidics” to tackle the problem of chip overheating.
When asked whether the long-term plan was to have mainly Microsoft chips in the company’s own data centers, Scott said it was, adding that the chip push is part of a strategy to design the entire system that ultimately goes into the data center.
“It’s about the entire system design. It’s the networking and the cooling, and we want the freedom to make the decisions needed to really optimize compute for our workloads,” Scott said.
Microsoft and its rivals Google and Amazon design their own chips not only to reduce their dependence on Nvidia and AMD, but also to make their products more efficient for specific requirements.
Not enough compute capacity
Tech giants including Meta, Amazon, Alphabet and Microsoft have committed to capital expenditures of more than $300 billion this year, much of it focused on AI investments to meet demand.
Scott flagged that there is still a shortage of computing power.
“A massive crunch (in compute) is probably an understatement,” Scott said. “I think we’ve been in a mode where it’s been almost impossible to build capacity fast enough since the launch of ChatGPT.”
Microsoft is building out capacity through data centers, but it is still not enough to meet demand, the CTO warned.
“Even our most ambitious forecasts have regularly turned out to be insufficient,” Scott said. “And … we have deployed an incredible amount of capacity over the past year, and even more is coming over the next few years.”
