Published Date: 17/10/2024
AMD is making a significant push into the Artificial Intelligence (AI) chip market, aiming to challenge Nvidia's leadership. The company recently unveiled its new Instinct MI325X AI chip, touting its superior specifications compared to Nvidia's current flagship, the H200. However, the market's response has been mixed, and questions loom about AMD's ability to gain a substantial share in this highly competitive space.
AMD's MI325X: Impressive on Paper, but Late to Market
AMD claims that its MI325X data center accelerator comes equipped with 256 gigabytes (GB) of High Bandwidth Memory 3 Extended (HBM3E), offering a bandwidth of 6 terabytes (TB) per second. This is a significant improvement over Nvidia's H200, which provides 141GB of HBM3E memory and a bandwidth of 4.8TB/second. According to AMD, the MI325X not only has 1.8 times more memory capacity and 1.3 times more bandwidth but also promises 1.3 times greater compute performance and similar advantages in AI inference tasks.
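The headline ratios follow directly from the published specs. A quick sketch to verify them (note that the bandwidth ratio is exactly 1.25x, which AMD rounds up to the marketed "1.3 times"):

```python
# Sanity check of the spec ratios quoted above, using the memory and
# bandwidth figures stated for the MI325X and H200.
mi325x_memory_gb, h200_memory_gb = 256, 141
mi325x_bw_tbps, h200_bw_tbps = 6.0, 4.8

memory_ratio = mi325x_memory_gb / h200_memory_gb  # ~1.8x capacity
bandwidth_ratio = mi325x_bw_tbps / h200_bw_tbps   # 1.25x, marketed as ~1.3x

print(f"Memory capacity: {memory_ratio:.2f}x")
print(f"Memory bandwidth: {bandwidth_ratio:.2f}x")
```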
The company plans to start production of the MI325X in the current quarter, with widespread availability expected in the first quarter of 2025. However, the timing of the launch is a crucial factor. Nvidia's H200 was announced nearly a year ago and started shipping in the second quarter of 2024, giving it a significant head start. This delay of around nine months could undermine AMD's competitive position.
Nvidia's Next-Gen Blackwell Processors: A Bigger Threat
While AMD is making strides, Nvidia is not standing still. The company is set to introduce its next-generation Blackwell processors, including the B200, which will pack 208 billion transistors—significantly more than the 153 billion on AMD's MI325X. The B200 will be manufactured using Taiwan Semiconductor Manufacturing's (TSMC) advanced 4-nanometer (nm) 4NP process node, compared to AMD's 5-nm process. This means that Nvidia's new chip will likely offer more computing power and better power efficiency.
The B200 is also expected to provide a higher memory bandwidth of 8TB/second, further solidifying Nvidia's technological lead. Nvidia is optimistic about the Blackwell's revenue potential, expecting to generate several billion dollars in the fourth quarter of its fiscal year, which runs from November 2024 to January 2025.
The Silver Lining for AMD Investors
Despite the challenges, there are reasons for AMD investors to remain optimistic. Nvidia's dominance in the AI chip market is such that the company is projected to generate almost $100 billion in data center revenue this fiscal year. AMD, on the other hand, is targeting a more modest but still substantial $4.5 billion in AI data center GPU revenue for 2024.
The AI GPU market is expected to be worth $500 billion by 2028, and AMD doesn't need to outperform Nvidia to see significant growth. Even as the second-largest player, AMD could capture a meaningful share of that market. For instance, if AMD captures just 10% of the AI GPU market by 2028, that would translate into roughly $50 billion in annual revenue from this segment.
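That back-of-the-envelope scenario is easy to extend. A minimal sketch, assuming the projected $500 billion market size holds (only the 10% share comes from the article; the other shares are illustrative assumptions):

```python
# Hypothetical annual AI GPU revenue for AMD at various market shares,
# given the projected $500B market size in 2028. Only the 10% scenario
# is cited in the article; 5% and 15% are assumed for comparison.
market_size_2028_billion = 500

for share in (0.05, 0.10, 0.15):
    revenue = market_size_2028_billion * share
    print(f"{share:.0%} share -> ${revenue:.0f}B annual revenue")
```

The 10% row reproduces the $50 billion figure quoted above.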
Strong Demand for H200 and MI325X
Nvidia's H200 processors are expected to continue selling well even as the Blackwell chips roll out. According to Nvidia, shipments of Hopper-based chips, including the H200, are set to increase in the second half of the fiscal year. This suggests that there is still room for AMD's MI325X in the market, despite the competition from more advanced processors.
Future Growth and Diversification
AMD's growing presence in the AI GPU market is expected to drive its overall growth. The company's revenue is forecast to increase by 13% in 2024 to $25.6 billion, with even better prospects for the following years. Moreover, AMD can leverage its expertise in AI beyond GPUs, such as in AI-enabled PCs and server processors.
Conclusion
While AMD's MI325X may not be enough to dethrone Nvidia as the leader in the AI chip market, it has the potential to give the company a substantial boost. Investors should keep an eye on AMD's progress in this space, as the company continues to innovate and capitalize on the burgeoning AI market.
Q: What is AMD's latest AI chip called?
A: AMD's latest AI chip is called the Instinct MI325X.
Q: How does AMD's MI325X compare to Nvidia's H200 in terms of memory and bandwidth?
A: AMD's MI325X has 256GB of HBM3E memory and a bandwidth of 6TB/second, which is 1.8 times more memory capacity and 1.3 times more bandwidth compared to Nvidia's H200, which has 141GB of HBM3E memory and a bandwidth of 4.8TB/second.
Q: When will AMD's MI325X be widely available?
A: AMD's MI325X is expected to be widely available in the first quarter of 2025.
Q: What is the expected memory bandwidth of Nvidia's B200 Blackwell processor?
A: Nvidia's B200 Blackwell processor is expected to offer a memory bandwidth of 8TB/second.
Q: What is the projected size of the AI GPU market by 2028?
A: The AI GPU market is expected to be worth $500 billion by 2028.