AWS CEO Matt Garman said Amazon’s recent $50 billion investment in OpenAI, after years of partnership including an $8 billion investment in Anthropic, is the kind of conflict of interest the cloud giant is used to dealing with.
Garman has worked at Amazon since joining as a business-school intern in 2005, the year before AWS launched, he told an audience at this week’s HumanX conference in San Francisco.
When asked about the conflict inherent in working closely with two AI model makers that are fierce (and perhaps at times petty) competitors, he said it’s not an issue. He explained that AWS itself often competes with its partners, so it has plenty of first-hand experience navigating that tension.
In the early days of AWS, the company recognized that it couldn’t build all its cloud products on its own, so it partnered with other companies.
“We also knew we would have to compete with our partners because technology is so interconnected,” said Garman. “That’s why we’ve been strengthening the way we go to market with our partners for a very long time,” he continued. “But we may have our own products that compete with them, and that’s OK. We promised them that we would not give ourselves an unfair competitive advantage.”
Today, the world is used to Amazon competing with companies that sell on its cloud. Even Oracle, one of AWS’ biggest competitors, sells databases and other services on AWS. But this was a radical idea back in 2006, when technology vendors took pains never to compete with the partners who helped them succeed.
Still, Amazon is hardly alone in setting aside concerns about investor loyalty and conflicts of interest in today’s wild, money-grabbing AI world. When Anthropic announced its latest $30 billion funding round in February, it included at least a dozen investors that also back OpenAI, among them OpenAI’s primary cloud partner, Microsoft.
For AWS, investing heavily in OpenAI and gaining it as a customer (and technology development partner) was almost a matter of life and death. Both companies’ models were already available on the cloud of Microsoft, AWS’s biggest rival.
The cloud giant is also working to put itself front and center by offering AI model routing services, which automatically send customers’ tasks to different models to maximize performance and reduce costs. As Garman explained, one model may be best for planning, another for reasoning, and a cheaper model for simple tasks like code completion. “I think that’s where the world is headed,” Garman said.
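The routing idea Garman describes can be sketched in a few lines: map each task type to a model suited to it, and fall back to the cheapest model for anything simple or unrecognized. This is an illustrative sketch only; the model names and per-token prices are hypothetical, not AWS’s actual routing logic.

```python
# Hypothetical model router: pick a model per task type to balance
# capability against cost. Names and prices are made up for illustration.

ROUTES = {
    "planning":        {"model": "frontier-planner",  "cost_per_1k_tokens": 0.015},
    "reasoning":       {"model": "frontier-reasoner", "cost_per_1k_tokens": 0.010},
    "code_completion": {"model": "small-coder",       "cost_per_1k_tokens": 0.001},
}

def route(task_type: str) -> str:
    """Return the model name for a task, defaulting to the cheapest model."""
    if task_type in ROUTES:
        return ROUTES[task_type]["model"]
    # Unrecognized tasks get the cheapest option rather than failing.
    return min(ROUTES.values(), key=lambda r: r["cost_per_1k_tokens"])["model"]

print(route("planning"))         # frontier-planner
print(route("chitchat"))         # small-coder (cheapest fallback)
```

Real routing services make this decision dynamically, often by classifying the incoming prompt itself, but the cost/capability trade-off is the same.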
This is also how Amazon, and Microsoft for that matter, deploy their homegrown models, once again putting them in competition with their partners.
All is fair in love and AI these days.
