On a recent earnings call, Advanced Micro Devices Inc. (AMD) laid out an ambitious projection for revenue from its first AI chips, and CEO Lisa Su appears confident the company can deliver on it. Su said AMD expects roughly $400 million in data-center revenue from its graphics processing units (GPUs) in the fourth quarter of this year, and she projected that total revenue from these chips could exceed $2 billion in 2024 as demand continues to grow.
According to Su, the success of the MI300 chip series would mark the fastest ramp to $1 billion in sales of any product in AMD's history. The forecast exceeded earlier Wall Street estimates: Wells Fargo analyst Aaron Rakers, for instance, noted that Su's projection topped his own estimate of $1.7 billion for AMD's data-center GPU revenue.
While AMD's overall fourth-quarter outlook was less favorable because of weakness in its gaming and embedded-chip businesses, the upbeat AI discussion helped ease investor concerns. AMD shares initially fell about 5% after hours on the results, but the stock recovered to finish the extended session down less than 1% as the AI commentary provided some reassurance.
To support her optimistic forecast, Su pointed to AMD's ongoing customer engagements around the MI300 chip family. She emphasized the strong demand for AI-focused chips and the "significant" progress AMD has made so far, expressing satisfaction with both technical milestones and customer partnerships that span hyperscalers, enterprise customers, and new AI startups. The company has not yet disclosed specific customer names.
AI Chips: Fueling Growth for AMD's Data-Center Segment
AMD's data-center segment, which posted flat revenue of $1.6 billion in the third quarter, stands to grow significantly with the introduction of AI chips. Under Su's leadership, AMD has steadily built a presence in the server market over the years, and enterprise customers are now more inclined to explore its new chip offerings as a result.
Targeting Inference Workloads with AI Chips
Su said AMD's latest graphics-processor chips, designed to accelerate AI, will serve both inference and training workloads, though she specifically emphasized their strong performance on inference.
Nvidia's Dominance in Training Area Creates Opportunities for AMD
Within the AI training space, Nvidia currently holds a dominant position and has seen substantial revenue growth from companies deploying AI in their data centers. Supply constraints, however, have pushed customers to look for alternatives, creating an opening for AMD to enter the market.
Potential Challenges Ahead
Investors have recently grown concerned about the impact of the U.S. ban on exporting certain advanced chips to China, which could affect both Nvidia's and AMD's future revenue streams. Additionally, Bernstein Research analyst Stacy Rasgon has pointed to several risks facing AMD's AI ambitions, including delays in reaching volume production and competition from Nvidia's upcoming Blackwell launch.
Confidence in AMD's Products and Customer Relationships
Despite these challenges, Su, known as a cautious forecaster, has expressed high confidence in AMD's products and customer relationships. With hopes of becoming a prominent player in the growing AI sector, AMD's management is committed to capitalizing on the new market.
In conclusion, AI chips have the potential to drive growth within AMD's data-center segment. With Lisa Su's strategic leadership and the company's strong relationships with enterprise customers, AMD is well-positioned to capitalize on this emerging market opportunity.