On The RAM Chokepoint

Here I want to go back to one of my favourite topics, chokepoints in tech supply chains. In the wake of my last few posts on supply chain fragilities and temporal arbitrage, I’ve been reading up on other potential squeeze points in the hardware we (willingly or unwillingly) rely upon. The last couple of days hastened my typing, because, well, I am a tad worried. Let me elaborate.

We are wading into the beginning of a RAM shortage that reaches from AI data centres down to your washing machine. Yet I haven’t seen much kicking and screaming about this – even though the implications are bound to be severe. And, as with many things these days, the fault (at least partially) falls on the usual AI suspects: the mundane trinket that ‘just works’ because it has a few anonymous memory chips inside is now competing with AI clusters that can pay almost any price for the same silicon.

But it’s not only AI’s fault – as often happens, it’s choked supply. Three firms sit on roughly 96% of global DRAM production: Samsung, SK Hynix, and Micron (the first two Korean, Micron American).[1] They have been here before: China launched a price-fixing probe into all three in 2018, and DRAM makers have repeatedly faced lawsuits and class actions accusing them of coordinating supply cuts to drive prices up. In other words, there isn’t a ‘natural scarcity’ of RAM, but a managed condition that just happens to map neatly onto (and plump up) the balance sheet.

Before the current situation, the plan within the memory triopoly was most likely to hold back on new factories, let a “disciplined” supply curve slowly tighten, and enjoy fatter margins over a couple of years. There (hopefully) wasn’t a memo outlining how to starve the world of RAM, just a lot of board talk about capex restraint, mix optimisation, and premium segments. Basically, a gentle squeeze. At any rate, if that hypothetical memo existed, the antitrust lawsuit would be open and shut. Then OpenAI walked in. In late 2025, Altman’s outfit locked in enormous, hush-hush memory deals with Samsung and SK Hynix for something on the order of 40% of global DRAM capacity, counted at the wafer level, not at the boxed-DIMM level you can grab off a shelf. That move compressed a planned multi-year “melt-up” in prices into a single, violent demand shock. It also, arguably, suggests that Altman isn’t exactly confident in OpenAI’s value proposition (or underlying economics) and would rather squeeze other players out of the sector.

Sam’s Bet
Altman chose the most boring (compared to GPUs), least substitutable thing in the AI stack. Compute has alternatives: different GPU vendors, custom ASICs, clever software optimisations. But if your model wants hundreds of gigabytes, you feed it DRAM and High Bandwidth Memory (HBM), or you don’t train it at all. Lock up enough of that, and you don’t have to out-innovate your rivals; you can just make sure they can’t get the components to compete.

Memory vendors saw the same leverage and ran the numbers. HBM for AI accelerators throws off margins of around 60%, versus 30–40% for good ol’ DDR5. That is why Micron, after riding an HBM-driven revenue wave (+127% YoY), announced literally yesterday that it would kill its 29-year-old Crucial consumer brand:

The AI-driven growth in data centres has driven a surge in demand for memory and storage. Micron has made the difficult decision to exit the Crucial consumer business to improve supply and support for our larger, strategic customers in faster-growing segments.

This is where the choke really starts to hurt, because RAM doesn’t only live in ‘computers’. Phones ship with 8–12GB of low-power RAM; smart TVs need a few gigabytes; EVs rely on around 90GB of RAM today, and orders of magnitude more for the “AI everywhere” fantasy being pitched to investors. IDC market projections now talk openly about phone prices jumping by around $70 on the back of memory alone, with low- to mid-range Android handsets squeezed hardest as component costs rise faster than retail prices can follow.
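The $70 figure is just per-handset memory cost arithmetic, and it’s easy to sanity-check. A minimal sketch, with the per-gigabyte prices below being my own illustrative assumptions (the post only cites the ~$70 outcome, not the underlying contract prices):

```python
# Back-of-envelope: how a DRAM contract-price spike lands in a phone's
# bill of materials. Prices per GB are illustrative assumptions only.

def memory_bom_delta(gb: float, old_price_per_gb: float, new_price_per_gb: float) -> float:
    """Extra memory cost per handset (USD) after a per-GB price jump."""
    return gb * (new_price_per_gb - old_price_per_gb)

# Assumed: a 12 GB handset, low-power DRAM tripling from ~$2.90/GB to ~$8.70/GB.
delta = memory_bom_delta(12, 2.90, 8.70)
print(f"Extra memory cost per phone: ${delta:.2f}")  # roughly $70
```

On a $200 budget Android phone, that delta is over a third of the retail price; on a $1,200 flagship it’s noise – which is exactly why the squeeze lands hardest at the bottom of the market.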

Stuck Until the Bubble Pops
Existing fabs are being retooled for HBM, and industry forecasts don’t show a clean, consumer-friendly supply expansion before 2027 (paywalled link; here’s an alternative) – and even that assumes no further AI demand shocks. Logically, capacity only swings back toward DDR and low-power mobile RAM if HBM margins fall below what those ‘less strategic’ products can offer, which nobody betting on the current AI boom expects anytime soon.
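The fab-allocation logic above boils down to a per-wafer profit comparison. A toy sketch, using the margins cited earlier (~60% HBM, ~35% DDR5) but with per-wafer revenue figures that are purely assumed for illustration:

```python
# Toy model of the retooling decision: a wafer only swings back to
# commodity DDR when its per-wafer profit beats HBM's. Revenue figures
# are illustrative assumptions, not actual fab economics.

def wafer_profit(revenue_usd: float, gross_margin: float) -> float:
    """Gross profit per wafer given revenue and gross margin."""
    return revenue_usd * gross_margin

# Assumed per-wafer revenues; margins are the ~60% / ~35% from the post.
hbm_profit = wafer_profit(30_000, 0.60)
ddr_profit = wafer_profit(12_000, 0.35)

print("Retool back to DDR?", ddr_profit > hbm_profit)  # False under these assumptions
```

Under any remotely similar numbers the answer stays “no”, which is why the consumer side of the market is stuck until AI demand (and HBM pricing) cracks.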

Stuff won’t vanish overnight. Prices inch up, lead times stretch, and the bottom of the market gets hollowed out until “cheap but decent” becomes a nostalgic category. The quick phone swap becomes a scavenger hunt, the straightforward warranty turns into a months-long exchange of shrugs and boilerplate (if your appliance provider’s current stock runs out, that is), and the unconnected, un-“smart” appliances you actually wanted disappear because no one can justify wasting premium-margin RAM on products that don’t phone home.

Oh, and have I mentioned that exactly the same thing is happening with NAND (i.e., the chips that make up your SSDs)?