$15 billion. That’s Amazon’s AI revenue run rate as of Q1 2026, and CEO Andy Jassy just dropped a bombshell that should make every cloud competitor nervous: Amazon might start selling its custom AI chips to third parties.
Let me be clear about what this means for anyone building AI tools or evaluating infrastructure options. Amazon isn’t just keeping its Trainium and Inferentia chips for internal use anymore. Jassy says demand is so high that selling “racks of them” to outside customers is now on the table.
This is a fascinating pivot, and frankly, a smart one. Amazon has spent years developing these chips specifically to reduce its dependence on Nvidia’s GPUs. Now it’s considering turning that investment into a direct revenue stream, competing head-on with the very supplier those chips were built to route around.
What This Actually Means for AI Builders
If you’re running AI workloads today, you’ve probably felt the pain of GPU scarcity and pricing. Nvidia has dominated this space so completely that alternatives have been hard to come by. AMD has been trying to gain ground, but adoption has been slow.
Amazon entering the merchant chip market changes the calculation. Their chips are already proven at massive scale within AWS. If they start selling hardware directly, it creates a third option for companies that want to own their infrastructure rather than rent it through cloud services.
But here’s where it gets interesting from a toolkit evaluation perspective: Amazon would be selling chips to companies that might compete with AWS itself. That’s a weird position to be in. Google has done something similar with its TPUs, offering them both as cloud services and discussing external sales, so there’s precedent. But Amazon’s scale is different.
The Nvidia Problem
Nvidia’s dominance in AI chips has been both a blessing and a curse for the industry. Their CUDA ecosystem is deeply entrenched, which means switching costs are real. Developers have built entire toolchains around Nvidia’s architecture.
Amazon’s chips require different optimization approaches. They’re not drop-in replacements. This matters because if you’re evaluating whether to bet on Amazon’s hardware, you need to factor in migration costs and the maturity of the software ecosystem.
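The lock-in problem is easy to see in code. The sketch below uses a hypothetical `select_backend` helper (not a real Nvidia or AWS API) to illustrate why a toolchain written against one vendor's stack doesn't port for free: the moment the cluster exposes a different accelerator, every hard-coded assumption has to be revisited.

```python
# Illustrative only: a hypothetical backend-selection helper showing why
# vendor-specific toolchains create switching costs. None of these names
# correspond to a real SDK call.

def select_backend(available: set[str]) -> str:
    """Pick a compute backend in a fixed preference order.

    The order here is arbitrary and for illustration; a real evaluation
    would weigh cost, ecosystem maturity, and workload fit.
    """
    for backend in ("cuda", "neuron", "rocm", "cpu"):
        if backend in available:
            return backend
    raise ValueError("no supported backend found")

# A pipeline that assumed CUDA now has to handle a Trainium-style
# deployment where only a different accelerator is present:
print(select_backend({"neuron", "cpu"}))  # -> neuron
```

The point is not the four lines of fallback logic; it is everything downstream of that string, which in practice means kernels, compilers, and profilers that differ per vendor.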
The question I keep asking: will Amazon’s chips be good enough to justify the switching costs? For AWS customers, the answer might be yes, especially if pricing is aggressive. For companies buying racks of chips to run on-premises, the calculation is more complex.
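To make the on-premises side of that calculation concrete, here is a toy break-even sketch. Every number below is an invented placeholder, not Amazon, Nvidia, or AWS pricing; the structure of the comparison is the point.

```python
def breakeven_months(rack_cost: float, monthly_opex: float,
                     cloud_monthly: float) -> float:
    """Months until owning a rack beats renting equivalent cloud capacity.

    rack_cost:     upfront hardware purchase (one-time)
    monthly_opex:  power, cooling, and ops staffing for the rack (recurring)
    cloud_monthly: cost of renting comparable capacity from a cloud provider
    """
    monthly_savings = cloud_monthly - monthly_opex
    if monthly_savings <= 0:
        return float("inf")  # owning never pays off at these rates
    return rack_cost / monthly_savings

# Hypothetical numbers for illustration only:
months = breakeven_months(rack_cost=500_000,
                          monthly_opex=20_000,
                          cloud_monthly=60_000)
print(round(months, 1))  # -> 12.5
```

Migration costs and software-ecosystem risk sit outside this arithmetic, which is exactly why the on-premises decision is harder than the cloud one.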
What I’m Watching
The AI infrastructure space is moving fast, and this announcement adds another variable to an already complicated decision matrix. Here’s what matters for anyone evaluating AI tooling and infrastructure:
- Pricing strategy: Will Amazon undercut Nvidia and AMD significantly, or just offer comparable pricing with better availability?
- Software ecosystem: How mature are the development tools and frameworks for Amazon’s chips outside of AWS?
- Performance benchmarks: Real-world performance data from third-party users, not just Amazon’s internal metrics.
- Support model: What does support look like for companies running these chips outside of AWS infrastructure?
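One way to turn the criteria above into something comparable across vendors is a simple weighted scoring matrix. The weights and scores below are placeholders I invented for illustration, not measured data about any real chip.

```python
# Hypothetical weighted scoring of accelerator options against the four
# criteria above. All weights and scores are invented for illustration.
CRITERIA_WEIGHTS = {
    "pricing": 0.30,
    "software_ecosystem": 0.30,
    "benchmarks": 0.25,
    "support": 0.15,
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine per-criterion scores (0-10) into one weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

# Placeholder profiles: a mature incumbent vs. a cheaper challenger.
incumbent = {"pricing": 5, "software_ecosystem": 9, "benchmarks": 9, "support": 8}
challenger = {"pricing": 8, "software_ecosystem": 5, "benchmarks": 7, "support": 6}

print(round(weighted_score(incumbent), 2))   # -> 7.65
print(round(weighted_score(challenger), 2))  # -> 6.55
```

The useful part of an exercise like this is arguing about the weights, which forces a team to say out loud how much ecosystem maturity is really worth against a pricing discount.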
Amazon’s aggressive AI spending has worried investors this year, with shares struggling as questions mount about returns on investment. Selling chips to third parties could help justify those expenditures by creating a new revenue stream beyond cloud services.
For toolkit reviewers like me, this development means we’ll need to start testing and benchmarking Amazon’s chips as standalone hardware options, not just as AWS services. That’s a significant shift in how we evaluate AI infrastructure.
The chip market just got more competitive, and that’s good news for anyone building AI products. More options mean better pricing and more innovation. But it also means more complexity in choosing the right foundation for your AI stack. Amazon’s move raises the stakes for everyone involved, and the next few quarters will show us whether they can execute on this vision.