Chipmaker stocks have soared over the past month, with Micron up 80%, SanDisk up 52% and Intel up 85%, to name just a few of the companies participating in the rally. Behind the rally is an evolution in AI system architecture known as "orchestration," in which workloads are distributed across multiple processing channels rather than concentrated in larger, centralized blocks. Orchestration leans more heavily on traditional central processing units (CPUs) than on the more powerful graphics processing units (GPUs) that fueled Nvidia's rise in the first phase of the AI buildout. While GPUs will remain essential for core AI tasks such as model training and query answering, Wall Street believes orchestration will claim a growing share of chip demand as AI software becomes more "agentic," meaning better able to handle generalized instructions.

[Chart: Micron vs. Nvidia, 1 year]

"We believe Agent AI increases the mix of CPUs and GPUs in AI systems by adding more orchestration, memory, and tooling work," Morgan Stanley analyst Sean Kim and colleagues wrote in a note to investors on Monday. "While this shouldn't reduce demand for GPUs, it does increase overall system complexity and redirects incremental infrastructure spending to CPUs, networking, and memory."

Orchestration has become a buzzword, and tech companies are making similar points about it, emphasizing coordination and adaptability within the infrastructure, rather than the chip architecture itself, as the way to improve AI computing power. "There is no single-chip architecture that can efficiently handle every workload," Meta said in an April statement announcing that it would rent "tens of millions" of Graviton CPUs from Amazon's cloud subsidiary. "As Meta advances its work on agentic AI, computing requirements will evolve and require more CPUs."

Chipmaker AMD also emphasized CPUs from an orchestration perspective as part of the deal it announced with Meta in February.
According to Reuters, the $60 billion deal stipulated that Meta would buy 6 gigawatts' worth of chips over five years, while also allowing Meta to acquire up to 10% of AMD's stock. "As AI infrastructure grows in size and complexity, CPUs will become a strategic pillar of the AI computing stack, enabling efficiency, scalability, and orchestration alongside GPUs," AMD said in a statement.

In cybersecurity, orchestration has already proven to be a cost-effective way to enhance AI capabilities. Anthropic's announcement of Mythos last month shocked the cybersecurity world and prompted the company to restrict access to the model, but multiple research organizations say they were able to reproduce its results by tweaking less advanced, publicly available models. "We used GPT-5.4 and Claude Opus 4.6 in open code with a standardized chunked security review workflow to try to reproduce Anthropic's patched public sample outside of Anthropic's internal stack," the Vidoc Security Lab researchers said, calling the results "more useful." "The point is not whether Mythos is better or more powerful, but that public models can already achieve nearly the same results," they said.

Cybersecurity company Aisle said it has done something similar, using a smaller, cheaper, tailored model to isolate the same security bug. "The small-scale model has already delivered sufficient improvements, and when wrapped in expert orchestration… produces results that the ecosystem takes seriously," the company wrote.

One industry consultant told CNBC that equating AI computing power with GPUs amounts to a "misconception" in the market. "The misconception in the market is that if you're going to run AI, you have to run it on a GPU. That's not the case. I don't know why people started believing that. Maybe it's Nvidia, marketing, etc.," said David Linthicum, former chief cloud strategy officer at Deloitte.
"When we train architects, we always tell them to use the minimum viable technology, and that's as much CPU as possible."

Other Beneficiaries

Other parts of the data center supply chain are also benefiting from the pivot to orchestration, particularly the interstitial segments that connect different processing channels. These include memory, such as DRAM and NAND, as well as electronic design automation, baseboard management controllers, and substrates. Prominent downstream companies mentioned in Morgan Stanley's May 11 memo on "agent" AI include major memory manufacturers such as Samsung, SK Hynix, Micron, SanDisk, and Kioxia, as well as KLA Corporation, Cadence Design Systems, and Taiwan-based Gold Circuit Electronics.
