AMD predicts that the MI300X will bring the company $2 billion in revenue in 2024, citing strong pull from customers. The processor is set for release at the AMD Advancing AI event on December 6.
“We are seeing a tremendous pull for MI300X,” AMD CTO Mark Papermaster told AIM, pointing to companies such as Lamini, Moreh, and Databricks that have already announced success stories after using the MI250X to train their generative AI models.
“In fact, Lisa Su (CEO) has said that MI300X is going to be the fastest-growing product in the history of AMD,” added Papermaster.
Another driver of AMD’s hardware momentum is its software stack, ROCm (Radeon Open Compute), which Papermaster said has reached production quality for high-performance computing. He added that the next version, ROCm 6.0, would be released soon and would be production-ready for AI workloads.
‘We are bringing a leadership product’
At Ignite 2023, Microsoft announced that it would be using MI300X for their AI workloads. Interestingly, at the same conference Microsoft also said that it would be using the NVIDIA GH200 superchips for the same purpose. “Competition is good. It brings innovation,” said Papermaster. “It brings pricing that ensures value for the customers and spurs the industry forward.”
“We have not only brought competition, but are also bringing in a leadership product in inference applications,” he added about the MI300X, which AMD positions as a leader in both training and inference, solidifying its place in the AI hardware landscape.
“MI300X is a high-performance GPU, capable of being scaled out to very large cluster sizes,” he said, adding that as AI evolves, the company is also focusing on edge computing at smaller scales, including laptops.
AMD highlighted the widespread adoption of Ryzen AI, the first dedicated AI accelerator available on an x86 processor. With over 50 systems now equipped with Ryzen 7000 Series processors and Ryzen AI, millions of AMD AI PCs are available in the market.
“Historically, if you go back in time, most AI inference was done on the CPU. And still today, we fully support inferencing on AMD CPUs. But generative AI applications in particular demand much higher compute capability. So when you look at generative AI, training and inferencing require acceleration, and that is where we are bringing in competition with the AMD Instinct GPU roadmap,” he said.
“What I want to highlight is that AMD has a broader strategy for AI than just GPUs. This is what makes us different from NVIDIA,” Gilles Garcia, senior director and business lead of the data center communications group at AMD, told AIM.
AMD believes that most AI workloads can be handled by CPUs alone. “CPUs are best for handling most of the challenges of current edge processing, such as thermal management, cost effectiveness, and reducing the footprint by working at the edge,” said Garcia.
The startup and open source community in India
Discussing the recent acquisition of Nod.ai, a software company now strengthening AMD’s AI software stack, Papermaster said that AMD is always looking at the startup community in India. “We have a strong design presence in India. The country will, of course, be central for our AI product development, hardware and software product development efforts.”
Highlighting that Hugging Face has also started using AMD GPUs for testing, he said that with the explosion of the AI ecosystem, AMD remains committed to an open ecosystem. “We are not proprietary or closed,” he added.
“Rather than needing just one specific partnership to get access to AMD, we’ve differentiated because we’re strongly committed to open source software and to open collaborations. I expect many such collaborations between AMD and India based on the strong adoption in India, and support of open source.”
To expand its research and engineering operations in India, AMD inaugurated its largest global design centre in Bengaluru. The AMD Technostar R&D campus is a key component of the company’s $400 million investment in India over the next five years. The campus is planned to accommodate around 3,000 new AMD employees focused on R&D.
“We are very focused on workforce development here in India,” Papermaster continued. “Jaya Jagadish, our country head and senior vice president, was the leader of a government panel which studied workforce development. She has made very specific recommendations to the government of India.”
Papermaster said that AMD has very innovative workforce development programmes, which the company is continually employing in India. “We have strong relationships with universities and we are also providing additional training to students. Then we bring them onto a very established internship programme,” he explained, describing how AMD has built a strong pipeline for engineering graduates in India.