On Wednesday, xAI and Anthropic announced a surprise partnership in which the Claude maker will buy “all of the computing power in [xAI’s] Colossus 1 data center” (approximately 300 MW), allowing Anthropic to immediately raise usage limits. This is a big deal for xAI, likely worth billions of dollars. More importantly, it immediately monetizes one of the company’s most impressive achievements, transforming xAI from a consumer of computing into a provider of it.
Amid ongoing litigation, it’s tempting to read the deal as a shot at OpenAI. But Musk’s explanation on X was simpler: xAI had already moved its training to a new data center, Colossus 2, and it simply didn’t need both.
In the short term, there is an obvious logic at work. xAI’s existing products are centered on Grok, but its usage has plummeted since the image-generation debacle earlier this year. If xAI’s data center buildout far exceeds what Grok needs to operate, leasing the surplus to Anthropic will add plenty of revenue to the balance sheet. That is especially helpful as the company, now merged with SpaceX, accelerates toward an IPO. More broadly, having Anthropic on the books as a customer makes it easier to believe that SpaceX’s orbital data center plan might actually work.
But beyond short-term profits, the Anthropic partnership sends an unusual signal about where Elon Musk’s priorities actually lie. It suggests that the company’s real business may be building data centers rather than training AI models.
It’s unusual for a major technology company to treat its computing resources this way, at a time when companies like Google and Meta are also training models and racing to build more data centers. That can be easy to overlook, since many of these companies simultaneously act as enterprise AI vendors, online services, and cloud providers. But when faced with a choice between selling spare computing to a customer and reserving it to build their own tools, those companies reliably choose door number two.
Just last month, Sundar Pichai acknowledged on a conference call that Google Cloud’s revenue was lower than it should have been due to the company’s “capacity constraints.” And when given the choice between renting GPUs and using them to develop AI products, Google chose AI products.
Meta faces a more extreme version of the same constraint, launching an entirely new cloud apparatus just to ensure it has enough GPU power to pursue Mark Zuckerberg’s AI ambitions. As he put it when announcing Meta Computing in January, “The strategic advantage lies in how we design, invest, and partner to build this infrastructure.”
The key word there is “strategic.” Both Zuckerberg and Pichai are looking toward a future in which AI powers some of the world’s most popular and profitable systems. Computing power is not only the means of meeting today’s inference demand, but of building tomorrow’s products. A shortfall in computing means missed opportunities.
By focusing on data centers (earthbound ones, for now), xAI positions itself more like a neocloud business, buying GPUs from Nvidia and renting them out to model developers like Anthropic. That is a much harder business, squeezed by chip suppliers on one side and volatile demand cycles on the other. The valuations of the most active neoclouds reflect that reality: xAI was valued at $230 billion in a January funding round, while CoreWeave, which manages an equivalent amount of computing power, is worth less than a third of that.
As you might expect, Musk’s version of the neocloud is more ambitious. Some of the data centers could be in space by 2035, if all goes as planned, and xAI intends to manufacture its own chips at Terafab, which would take away some, though not all, of Nvidia’s pricing power. But none of that changes the fundamental economics of the neocloud business.
As recently as February’s all-hands, xAI had real software ambitions. That was the presentation that revealed the orbital data center project, but it also hinted at some intriguing ideas, including big ambitions in coding (since bolstered by the company’s partnership with Cursor) and leveraging computer-use agents for full-fledged digital twins (a project unfortunately dubbed Macrohard). These are long-term projects that need dedicated computing to succeed. As long as xAI is selling massive amounts of computing to its competitors, it’s hard to see much of a future for those new ambitions.
