$360 Billion. That’s How Much Micron Added Since One Market Low.
Micron has nearly doubled since the March 30 market low, adding more than $360 billion in market value. That number alone should make anyone who wrote off memory chips as a boring corner of the AI trade feel a little uncomfortable right now.
But here at agntbox.com, we review what actually works — not what gets the most hype. And the story unfolding in AI stocks right now mirrors something I see constantly when testing AI toolkits: the flashy names grab the headlines, and the unglamorous infrastructure quietly does the heavy lifting.
The Trade Nobody Was Watching
Since ChatGPT launched in November 2022, two names have outperformed both Nvidia and Micron in the AI trade: Western Digital and Seagate. Yes, those Western Digital and Seagate. The companies you associate with external hard drives sitting in a desk drawer. According to Yahoo Finance data, both storage giants have outpaced the GPU darling and the memory chip star over that same period.
That’s not a typo. That’s not a misread chart. Storage companies — the ones making spinning disks and flash drives — have been among the strongest performers in the AI trade since the moment the world decided large language models were going to change everything.
So what happened? And more importantly, what does it tell us about where the real value in AI infrastructure actually sits?
Why Storage Makes Sense When You Think About It
When I test AI tools for this site, one thing becomes obvious fast: these systems are hungry. Not just for compute, but for data storage and retrieval at a scale that most people don’t think about until something breaks. Every model needs training data stored somewhere. Every inference call pulls from somewhere. Every enterprise deploying AI at scale is generating logs, outputs, and datasets that have to live somewhere.
Nvidia gets the glory because GPUs are the visible engine. But storage is the fuel tank. You can have the fastest engine on the track and still lose the race if you run dry on lap three.
Western Digital and Seagate are building those fuel tanks. And as AI workloads scale from research labs into enterprise production environments, the demand for high-capacity, high-throughput storage has grown alongside them. The market, apparently, noticed before most commentators did.
Micron Is Not Sitting Still Either
To be fair to the chip side of this story, Micron is not exactly struggling. Analysts tracking the stock in 2026 have pointed to exponential earnings growth as a reason Micron could double by year end. The company has already jumped impressively this year, and its memory chips are deeply embedded in the AI supply chain — particularly in the high-bandwidth memory that modern AI accelerators depend on.
So the picture is more nuanced than a simple “storage beats chips” narrative. What’s actually happening is that the AI trade is broadening. The early thesis was simple: AI needs GPUs, buy Nvidia. That thesis wasn’t wrong, but it was incomplete. The market is now pricing in a more thorough view of what AI infrastructure actually requires.
What This Means If You’re Watching the AI Space
From where I sit reviewing AI toolkits, the lesson maps directly onto how I evaluate products. The tools that get written up in every newsletter are not always the ones delivering the most value to the people actually building things. Sometimes the most important piece of your stack is the one nobody’s writing think-pieces about.
- Storage demand scales with every new AI deployment, not just new model releases.
- Western Digital and Seagate have outperformed Nvidia since the ChatGPT launch, per Yahoo Finance data.
- Micron has added more than $360 billion in market value since the March 30 low and may double by end of 2026.
- The AI infrastructure trade is no longer a single-stock story.
The Honest Take
Nobody was putting Western Digital on their AI watchlist in late 2022. That’s exactly the point. The AI trade has always been bigger than its most famous names, and the market is slowly catching up to what the actual infrastructure requirements look like at scale.
Whether you’re picking stocks or picking tools, the same principle applies: follow the workload, not the hype. The workload needs compute, yes. But it also needs memory, and it needs somewhere to put all that data when the GPU is done with it.
Turns out, the boring stuff was never that boring.