Amazon the chip company? Tech giant says it may sell AI chips as a product, not just a cloud service [Business Insider]
It was only a matter of time before the biggest player in cloud computing decided to try selling the actual shovels, not just the mining rights. Amazon is reportedly weighing a move that would fundamentally shift its hardware strategy: selling its own artificial intelligence chips directly to businesses, rather than keeping them locked inside Amazon Web Services (AWS) data centers.
For years, the conventional wisdom in Silicon Valley was that Amazon designed custom silicon—chips like the Trainium and Inferentia processors—to make its own cloud service cheaper and faster. The chips were a secret weapon, a way to offer customers better performance per dollar for training large language models or running inference. But according to multiple reports and signals from the company’s recent public statements, Amazon may now rip up that playbook and turn its chip division into a standalone product line.
The "secret weapon" goes on the market
The logic here is surprisingly straightforward. AWS generates the vast majority of Amazon’s operating income. It is a money-printing machine built on renting compute power. So why would Amazon want to sell a chip that could end up in a competitor’s data center? Because the AI chip market is currently dominated by Nvidia, with only a handful of challengers trailing far behind, and Amazon sees a massive opportunity to capture value from the entire ecosystem—not just the part that rents servers.
Sources close to the matter indicate that Amazon is exploring multiple go-to-market paths. The most likely scenario involves selling chips directly to large enterprises, telcos, and even other cloud providers. Think of it as a wholesale hardware play. A company could buy a rack of Amazon’s Trainium chips, install them in its own facility, and run AI workloads without ever touching AWS. That would directly compete with Nvidia’s data center GPUs, but it would also cannibalize some of AWS’s own rental business.
But Amazon might be betting that the cannibalization is worth it. The AI chip market is expected to grow to well over $100 billion annually within the next few years. If Amazon can capture just 5% of that as a hardware vendor, it adds billions in annual revenue at much higher margins than its retail business. More importantly, it gives the company a lock on the AI supply chain that goes far beyond cloud subscriptions.
Why now? The Nvidia chokehold
The timing is no accident. Nvidia’s H100 and B200 GPUs are incredibly powerful, but they are also incredibly expensive and incredibly scarce. Companies that don’t want to build their entire AI strategy on one vendor are desperate for alternatives. Amazon’s Inferentia and Trainium chips are already proven in production—companies like Airbnb, Snap, and Pinterest use them inside AWS. The performance is competitive, and the price is often dramatically lower than Nvidia’s offerings.
By selling these chips directly, Amazon gives those companies a way to diversify their supply chain without shifting their entire infrastructure to the cloud. It also allows Amazon to compete with Microsoft and Google on a different axis. Microsoft is cozy with Nvidia through Azure and also builds its own Maia chips. Google has its Tensor Processing Units (TPUs) but keeps them largely internal. Amazon would be the first of the Big Three to try to sell its custom silicon as a standalone product in the open market.
There is a precedent here. Amazon has slowly been opening up its logistics network to third-party sellers, effectively competing with FedEx and UPS while still running its own delivery service. The company has never been afraid to compete with its own customers if the economics make sense. Selling AI chips would follow the same playbook: build a best-in-class technology for internal use, then monetize it externally when the market demands it.
The challenges ahead
This strategy is not without risk. Selling hardware is a very different business from selling cloud services. You need a sales force that understands chip architectures, supply chain logistics for silicon, and long-term support contracts. Amazon has some of that experience from its AWS hardware business, but selling a chip to a bank’s IT department is a different beast than renting a virtual machine to a startup.
There is also the software question. Nvidia’s real moat is CUDA, its software platform that makes it easy to run AI workloads on its GPUs. Amazon has its own software stack, called AWS Neuron, which is optimized for Inferentia and Trainium. But it is not as ubiquitous as CUDA. If Amazon wants to sell chips directly, it will need to invest heavily in making Neuron a first-class development environment that works outside the AWS ecosystem.
Finally, there is the delicate dance with AWS customers. Amazon is essentially saying, "We will rent you compute on our cloud, or we will sell you the chip to run yourself." That could confuse the market and anger some AWS partners who see the chips as a key reason to stay locked into the platform. But Amazon has always operated with a long-term view. If the AI chip market is the next gold rush, it wants to sell picks and shovels to everyone—even the miners who dig their own holes.
For now, the plan is reportedly still in the exploratory phase. No formal product announcement has been made. But the signs are clear: Amazon is no longer just a cloud company. It is becoming a chip company. And if the move succeeds, it could reshape the entire AI hardware market, giving developers a real alternative to Nvidia and forcing every other chipmaker to rethink its strategy. The only question left is whether Amazon has the nerve to compete with itself.
Ahmed Abed – News journalist