Advanced Micro Devices (AMD) has outlined an aggressive roadmap for developing powerful new artificial intelligence chips over the next two years, aiming to mount a significant challenge to Nvidia's commanding lead in the AI semiconductor market.
At the Computex technology trade show in Taipei, AMD unveiled the MI325X accelerator, a high-performance AI processor scheduled for release in the fourth quarter of 2024. The chip is designed to meet soaring demand for advanced silicon capable of powering AI workloads and generative AI applications in data centres.
Nvidia currently dominates the lucrative AI chip space with an estimated 80 per cent market share. However, AMD has made AI a central focus as it looks to offer a compelling alternative and grab market share from its rival.
"AI is our top priority as a company," said AMD CEO Lisa Su during a presentation. "We have concentrated our development resources on consistently delivering new, more capable AI products on an annual cadence to stay competitive."
Following the MI325X, AMD plans to launch the MI350 chip series in 2025 based on a new computing architecture. The company claims this next generation will provide a 35-fold increase in AI inference performance compared to the existing MI300 products.
Looking further ahead to 2026, AMD previewed the MI400 series, underpinned by an architecture dubbed "Next". Although details were scarce, these future chips aim to further boost AMD's AI computing prowess as it goes head-to-head with Nvidia's offerings.
Nvidia has already signalled its intention to shorten product cycles, with CEO Jensen Huang revealing plans for a next-gen AI platform called Rubin in 2026 spanning GPUs, CPUs and networking silicon.
The race to develop cutting-edge AI chips has captured substantial investor interest and capital. Reflecting this, AMD's Su projected around $4 billion in AI chip revenue for 2024 during the company's latest earnings call – a $500 million increase over the prior forecast.
While data centres deploying AI remain the prime target for dedicated accelerators, AMD is also integrating AI capabilities into its mainstream CPUs launching in the second half of 2024. Additionally, the company is developing neural processing units (NPUs) specifically tailored for on-device AI compute on PCs from vendors like HP and Lenovo.
As generative AI goes mainstream, driven by applications like ChatGPT, the high-stakes battle for supremacy in AI silicon is intensifying. AMD is pulling out all the stops to dethrone Nvidia and secure a bigger slice of this rapidly expanding, lucrative market.