The semiconductor industry is the beating heart of the artificial intelligence (AI) revolution. Most investors are focused on Nvidia (NASDAQ: NVDA) -- and rightly so, because it makes the best data center chips for developing AI -- but it isn't the only semiconductor company cashing in on this technology boom.
Micron Technology (NASDAQ: MU) is a leading supplier of memory and storage chips, which have become important components of the AI hardware story. As a result, the company's data center revenue is soaring at the moment, but it also faces significant AI opportunities in the smartphone and personal computing (PC) segments.
Micron reported a strong set of financial results for its fiscal 2025 first quarter (ended Nov. 28) on Wednesday, Dec. 18, yet its stock plunged 12% in after-hours trading. That said, the Nasdaq-100 technology index sank 3.6% on the day, its second-worst single-day drop of 2024, so it was a rough session for the stock market overall.
Therefore, this might be a golden opportunity for investors as we head into 2025. The Wall Street Journal tracks 43 analysts who cover Micron stock, and the overwhelming majority have assigned it the highest-possible buy rating. Here's why the Street is so bullish.
Memory chips complement the graphics processing units (GPUs) supplied by Nvidia. They hold data in a ready state so it can be accessed instantly, which is essential for data-intensive AI workloads. Since many AI models are now trained on trillions of data points, they require enormous memory capacity.
Micron's HBM3E (high-bandwidth memory) solutions are the best in the industry, providing 50% higher capacity than competing hardware while consuming 30% less energy. That's why Nvidia chose Micron's HBM3E to power its new Blackwell GB200 data center GPU, which is Nvidia's most powerful AI chip to date.
Micron is completely sold out of its data center memory chips until 2026, but it isn't resting on that success. It's already working on a new HBM4E solution, which will offer a 50% leap in performance compared to its HBM3E hardware. The market for data center HBM is worth around $16 billion annually right now, but Micron predicts that number will grow to $100 billion by 2030. Maintaining a technological edge will be key to capturing as much of that value as possible.
But Micron's AI opportunity extends beyond the data center, because PCs and smartphones can process some AI workloads on-device without relying on external computing power. The company says PCs fitted with AI processors require between 16 and 24 gigabytes of DRAM, compared with an average DRAM content of 12 gigabytes in non-AI PCs last year. Higher capacity translates into more expensive DRAM chips, which means more revenue for Micron.