If you've been fortunate enough to read Chris Miller's Chip War, you know that the memory chip industry is notorious for its boom and bust cycles.
From the 1970s through the 2000s, the memory industry was one of the fastest growing industries in the world, experiencing annual growth rates of up to 70%.
But despite creating immeasurable value for its clients and driving technological advances for society, McKinsey estimates that memory chip makers lost a combined $9.5 billion in investor wealth through multiple boom and bust cycles between 1997 and 2012.
But this time, things may really be different: Rising demand for increasingly powerful AI hardware and software is creating a bull market in memory chips, with sell-side analysts just now revising their forecasts upwards.
Boom and bust
But first, let's back up: How did this destruction of capital happen?
The main culprit was the cyclical dynamic that flows from the nature of memory chips themselves. Despite being technically complex and crucial to the digital age, memory chips are essentially a commodity: unlike logic chips or AI chips (the "brains" that perform computational tasks), memory simply stores data.
There are two types of memory chips:
Dynamic Random Access Memory (DRAM) stores data that is actively used by a device's operating system and applications, while NAND (short for "NOT AND") flash chips, which are used for long-term storage, come in a relatively limited range of configurations with little variation between manufacturers.
Price is often used as a weapon, and Korean players, particularly Samsung, are notorious for continuing to invest throughout the cycle, even during periods of low utilization.
However, a look at fact sheets for the major UK technology investment trusts shows that Polar Capital Technology, Allianz Technology and Manchester & London all list Micron Technology in their top 10 holdings. Additionally, M&L and Allianz both hold Western Digital, with Allianz listing it in their top 10 holdings.
Cycle hit
This is interesting because technology fund managers tend to gravitate toward predictable, annuity-type income streams rather than riding cyclical roller coasters. Such crossover between these three portfolios is unusual, and it is odder still that none of them includes the number one and number two players in this market, South Korea's Samsung and SK Hynix, in their top 10.
What on earth is going on, and why are these three fund managers suddenly attracted to this sector?
There is no doubt that the industry has become more streamlined, with the number of players reduced to just three major companies – the two Korean companies and Micron – and the market structure becoming more oligopolistic.
The evidence can be seen in the public message that these three companies will be disciplined on capacity and pricing. In other words, Korean companies have decided that it is better to own 40% of a high-margin industry than to have a complete monopoly on a low- or negative-margin industry.
Moreover, since the memory industry is only just coming out of a post-COVID downturn, we may now be entering the up phase of the cycle: prices have risen from the bottom of the last cycle, with spot prices for NAND and DRAM currently estimated to be up 191% and 26%, respectively.
Growth profile
While these two points make investing in this industry more attractive, I believe the real reason investors are entering this industry is the changing growth profile of this sub-sector.
Historically, memory has been used in PCs and servers, but in the future we are likely to see a surge in memory demand from all forms of AI across servers, edge devices, mobile and more. This has led sell-side analysts to revise their memory growth forecasts to 15-20% per year over the next five years and to start describing this subsector as enjoying secular rather than merely cyclical growth.
Memory is even more important to AI chips (graphics processing units, or GPUs, and specialized AI accelerators) than to general-purpose central processing units (CPUs), for several important reasons.
- While a CPU has only a few cores, AI chips have parallel architectures with thousands of cores designed to perform matrix multiplication operations in parallel on large datasets. This parallelism requires extremely high memory bandwidth to keep all the cores fed with data.
- AI workloads, especially training large neural networks, require huge amounts of data to be stored and accessed quickly. The memory bandwidth and capacity requirements are orders of magnitude higher than those of traditional CPU workloads. The Nvidia A100 GPU can deliver 2 terabytes per second of memory bandwidth, more than 20 times the roughly 90 gigabytes per second available to high-end CPUs.
- Today's largest AI models have billions, or even trillions, of parameters that must be stored in memory during training. Training these huge models on a CPU with limited memory bandwidth becomes prohibitively slow.
- The memory subsystem accounts for a large portion of the chip area, power, and cost of modern AI accelerators, reflecting its importance: optimizing memory bandwidth and capacity is a key design priority for reducing power, which is crucial to the total cost of ownership of these chips.
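The bandwidth arithmetic behind these points can be made concrete. The sketch below is a back-of-the-envelope calculation, not a vendor benchmark: the 70-billion-parameter model size and the bytes-per-parameter figure are illustrative assumptions, while the 2 TB/s and 90 GB/s bandwidth figures echo the GPU-versus-CPU comparison above.

```python
# Back-of-the-envelope arithmetic for the memory demands described above.
# All model sizes and densities are illustrative assumptions, not specs.

def model_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Memory needed just to hold the weights (FP16 = 2 bytes per parameter)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

def bandwidth_bound_tokens_per_s(params_billion: float,
                                 bandwidth_tb_s: float,
                                 bytes_per_param: int = 2) -> float:
    """Rough upper bound on generation speed: producing each token requires
    streaming every weight through memory once, so bandwidth caps throughput."""
    weight_bytes = params_billion * 1e9 * bytes_per_param
    return bandwidth_tb_s * 1e12 / weight_bytes

# A hypothetical 70B-parameter model:
print(model_memory_gb(70))                    # 140 GB of weights alone
print(bandwidth_bound_tokens_per_s(70, 2.0))  # ~14 tokens/s at a GPU-class 2 TB/s
print(bandwidth_bound_tokens_per_s(70, 0.09)) # <1 token/s at a CPU-class 90 GB/s
```

The same calculation explains the "prohibitively slow" point above: at CPU-class bandwidth, simply moving the weights becomes the bottleneck long before compute does.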
Now the sell side is starting to notice this gradual shift in demand. Micron shares rose 5.5% today to a record high of $156 after brokers Cantor Fitzgerald and Susquehanna Financial raised their price targets yesterday, citing DRAM and NAND memory price movements. The upgrades also helped SK Hynix shares hit a record high in Seoul.
High places
Then came High Bandwidth Memory (HBM), which stacks up to 12 DRAM dies vertically to shorten the distance data has to travel and shrink the form factor, delivering even higher bandwidth, capacity, and performance at lower power consumption.
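A quick sketch shows why stacking matters for capacity. The die density and per-accelerator stack count below are illustrative assumptions; only the 12-die stack height comes from the description above.

```python
# Rough capacity sketch for stacked HBM, using illustrative numbers.
dies_per_stack = 12   # up to 12 DRAM dies per HBM stack, per the text
gb_per_die = 2        # assumed die density (varies by generation)
stacks_per_chip = 8   # assumed number of HBM stacks on one accelerator

capacity_gb = dies_per_stack * gb_per_die * stacks_per_chip
print(capacity_gb)    # 192 GB of memory on a single package
```

Putting that much DRAM millimetres from the compute die, rather than on separate modules, is what lets HBM combine high capacity with the bandwidth and power advantages described above.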
HBM is increasingly being used to power machine learning, high-performance data centers, and more recently generative AI models, with memory players reallocating more than 20% of their total DRAM wafer supply here, leading analysts to predict double-digit price increases in both the first and second halves of this year, with further increases predicted in 2025.
In other words, it doesn't get much better than this for the “big three” of memory.
As for why these trusts prefer Idaho-based Micron over the South Korean companies, it could reflect geopolitical tensions and efforts by the U.S. government to bolster the domestic semiconductor industry.
So if you feel like you missed out on the AI chip boom (we don't think you have), this rally is still early in the game and the valuation metrics are attractive.
Sure, Micron's stock may have doubled over the past year, but its EBITDA multiple is still only around 10x, well below Nvidia's 35x.
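For readers unfamiliar with the metric, an EV/EBITDA multiple is simply enterprise value divided by EBITDA. The snippet below illustrates how a stock can double and still carry a modest multiple if earnings grow alongside it; the dollar figures are hypothetical placeholders chosen only to reproduce the 10x and 35x multiples quoted above.

```python
# Illustrative EV/EBITDA comparison; all dollar figures are hypothetical
# placeholders, not the companies' actual financials.

def ev_ebitda(enterprise_value_b: float, ebitda_b: float) -> float:
    """Enterprise value divided by EBITDA, both in $ billions."""
    return enterprise_value_b / ebitda_b

# A share price can double yet the multiple stays low if EBITDA keeps pace:
print(ev_ebitda(170, 17))    # 10.0x (Micron-like multiple)
print(ev_ebitda(2100, 60))   # 35.0x (Nvidia-like multiple)
```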
Secret Tech Investor is a seasoned professional who has been managing technology assets for over 20 years.