Advanced Micro Devices (AMD) has intensified its challenge to Nvidia's AI chip dominance with the launch of its new Instinct MI350 series accelerators, which CEO Lisa Su claims outperform Nvidia's competing Blackwell processors.
At AMD's 'Advancing AI 2025' event in San Jose on June 13, Su unveiled the MI350X and MI355X accelerators, which began shipping earlier this month. The new chips deliver a 35-fold increase in inferencing performance over their predecessors, the company's most significant generational leap to date.
The MI350 series features 288GB of HBM3E memory—more than Nvidia's individual Blackwell GPUs—and supports new data formats including FP4 and FP6. AMD claims the MI355X can deliver 40% more tokens per dollar than Nvidia's chips due to lower power consumption, potentially offering significant cost advantages for AI deployments.
Su also announced the AMD Developer Cloud, a new service allowing developers to access AMD's MI300 and MI350 GPUs via cloud computing without purchasing the hardware directly. This follows a similar offering from Nvidia and aims to expand AMD's footprint in the AI ecosystem by making its hardware more accessible to developers.
Looking ahead, Su previewed AMD's next-generation MI400 line of chips and 'Helios' AI rack infrastructure, scheduled for launch in 2026. The MI400 will feature up to 432GB of HBM4 memory with bandwidth reaching 19.6TB per second, and is designed to compete with Nvidia's GB300 Blackwell Ultra processors and upcoming Rubin AI GPUs.
The stakes in the AI chip market continue to rise dramatically, with Su now predicting the market will exceed $500 billion by 2028—a figure she noted was once considered unrealistically large. AMD's aggressive push into AI acceleration comes as the company seeks to capitalize on growing demand from major technology firms like Microsoft, Meta, and OpenAI, whose CEO Sam Altman appeared on stage with Su to announce OpenAI would use AMD chips.