In a world where artificial intelligence (AI) is rapidly reshaping industries, a recent shockwave hit the semiconductor market. Shares of major AI hardware players like Nvidia (NVDA), Advanced Micro Devices (AMD), and Micron Technology (MU) plunged after the emergence of a game-changing competitor. DeepSeek, a Chinese AI start-up, unveiled a new approach to training AI models that could dramatically reduce the costs traditionally associated with this technology. Investors were quick to fear that this innovation could undermine demand for graphics processing units (GPUs), the very hardware that powers AI, and by extension the stock prices of the companies that produce them.
However, despite the panic, recent statements from Meta's CEO, Mark Zuckerberg, are providing some much-needed clarity and reassurance to investors. Zuckerberg's insight not only eases some of the fears about AI hardware demand but also suggests that the future of Nvidia, AMD, and Micron remains bright, particularly in light of growing AI and data center infrastructure needs.

The initial shock came from DeepSeek's announcement that its V3 large language model (LLM) was able to match the performance of the most cutting-edge models, such as OpenAI's GPT-4, at a fraction of the cost. While that in itself isn't cause for alarm, what raised eyebrows was the price tag: DeepSeek reportedly spent just $5.6 million to train its model, a figure that covers only the final training run's GPU time rather than the company's prior research and hardware investments. Even so, it stands in stark contrast to OpenAI, which has poured more than $20 billion into developing its models over the years.
Even more concerning for Nvidia, AMD, and Micron investors was that DeepSeek achieved this feat without the most advanced GPUs, which U.S. export controls bar from sale to Chinese companies. Rather than relying on Nvidia's unrestricted flagship hardware, DeepSeek trained its model on the H800, a deliberately throttled, export-compliant variant of the H100.
This sparked widespread concern that if other AI companies followed DeepSeek’s example, they might be able to run powerful AI models without the expensive, cutting-edge hardware currently supplied by Nvidia, AMD, and Micron. Could this signal the beginning of a reduced need for these companies’ products, and with it, a dramatic shift in the market?
Despite the initial panic, Mark Zuckerberg’s comments on January 29, 2025, during a conference call to discuss Meta’s fourth-quarter performance, have offered investors some much-needed reassurance. Zuckerberg’s message? The demand for AI chips is unlikely to decline, and in fact, may shift in ways that benefit leading semiconductor companies like Nvidia, AMD, and Micron.
While it's true that DeepSeek's more efficient AI model training techniques could reduce the overall need for computational resources in the training process, Zuckerberg explained that the demand for AI chips won't disappear entirely. Instead, the focus will likely shift from training AI models, where enormous amounts of data and computational power are required, to inference. Inference refers to the process by which a trained AI model applies what it has learned to process user inputs and generate outputs.
Zuckerberg pointed out that as the industry moves toward reasoning, where AI models take more time to "think" before producing an answer, there will still be a need for powerful chips to handle these processes. This shift from data-intensive training to inference does not mean less capacity is required. On the contrary, the need for inference compute, the computational power used to run trained models and execute AI reasoning, will only increase as AI models become more advanced and capable.
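To make the training-versus-inference distinction concrete, here is a minimal, purely illustrative PyTorch sketch. It uses a toy linear layer standing in for a real LLM and is not DeepSeek's or Meta's actual code: training requires a backward pass and weight updates for every batch, while inference is a forward pass that must be repeated for every single user query.

```python
# Illustrative sketch only: a toy model standing in for a large language model.
import torch
import torch.nn as nn

model = nn.Linear(16, 16)  # stand-in for billions of parameters
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# --- Training step: forward pass + backward pass + weight update (compute-heavy, done per batch) ---
inputs = torch.randn(8, 16)
targets = torch.randn(8, 16)
loss = loss_fn(model(inputs), targets)
loss.backward()           # gradients require extra memory and compute
optimizer.step()
optimizer.zero_grad()

# --- Inference: forward pass only, repeated for every user request (scales with usage) ---
model.eval()
with torch.no_grad():     # no gradients needed, but each query still consumes GPU time
    user_prompt = torch.randn(1, 16)
    output = model(user_prompt)
print(output.shape)       # torch.Size([1, 16])
```

Reasoning-style models amplify the second half of that picture: they run many forward passes per answer, so even if training gets cheaper, serving hundreds of millions of users keeps GPUs and high-bandwidth memory busy.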
For Meta, a company that spends billions on AI infrastructure and data centers, this means continued demand for top-tier GPUs and data center components. Meta’s ambitious plans for its own Llama LLMs, which are gaining traction worldwide, make it clear that high-end infrastructure is still vital for achieving the next generation of AI capabilities.
Zuckerberg’s insights serve as a strong counterpoint to the initial fears triggered by DeepSeek’s announcement. They highlight that while AI may evolve and become more efficient, it doesn’t signal the end for Nvidia, AMD, or Micron. In fact, the shift toward more advanced AI inference models could increase the demand for high-performance chips, which these companies are well-positioned to provide.
Nvidia has already experienced explosive growth, with its fiscal year 2025 revenue expected to reach a record $128.6 billion, much of it driven by data center sales of its GPUs. Over the past two years, Nvidia has added roughly $2.5 trillion to its market capitalization, a testament to its pivotal role in the AI revolution. Meanwhile, AMD is emerging as a formidable competitor in the GPU market, planning to launch its new MI350 GPU in 2025, which could rival Nvidia's Blackwell chips for processing AI workloads. Micron, often overlooked in the AI conversation, plays an essential role by supplying the high-bandwidth memory (HBM3E) that keeps AI accelerators in data centers fed with data, a crucial component for getting the most out of these chips.
The AI market is still in its early stages, and innovations like DeepSeek’s offer a glimpse into the future of AI model training. However, this does not diminish the need for powerful hardware—if anything, it underscores the importance of staying at the cutting edge of AI technology. For companies like Nvidia, AMD, and Micron, the future remains bright. As AI moves beyond basic model training and focuses on reasoning and inference, the demand for the most advanced hardware will grow, particularly as the broader AI ecosystem scales up to serve millions of users.
Zuckerberg’s outlook suggests that companies will need even more sophisticated data center infrastructures to support the advanced reasoning capabilities of AI. For investors, this is a long-term trend that promises sustained growth for Nvidia, AMD, and Micron as AI continues to evolve.