(Bloomberg) — Advanced Micro Devices Inc., targeting a growing market dominated by Nvidia Corp., unveiled new accelerator chips that it said will be able to run artificial intelligence software faster than competing products.
The company unveiled the much-anticipated MI300 series at an event held Wednesday in San Jose, California. Chief Executive Officer Lisa Su also gave an eye-catching prediction for the size of the AI chip industry, saying it could climb to more than $400 billion in the next four years. That's more than double a projection AMD gave in August, showing how quickly expectations for AI hardware are changing.
The launch is one of the most significant in AMD’s five-decade history, setting up a showdown with Nvidia in the hot market for artificial intelligence accelerators. Such chips help develop AI models by bombarding them with data, a task they handle more adeptly than traditional computer processors.
Building artificial intelligence systems that rival human intelligence — considered the holy grail of computing — is now possible, Su said in an interview. But the development of the technology is still just beginning, and it will take time to assess its impact on productivity and other aspects of the economy, she said.
“The truth is, it’s so early,” Su said. “This is not a fad. I believe that.”
AMD is showing growing confidence that the MI300 series can win over some of the biggest names in tech, potentially diverting billions in spending to the company. Customers using the processors will include Microsoft Corp., Oracle Corp. and Meta Platforms Inc., AMD said.
Nvidia shares fell 2.3% to $455.03 in New York on Wednesday, a sign that investors see the new chip as a threat. AMD's own stock didn't get a boost, though: on a day when tech shares were broadly lower, it slipped 1.3% to $116.82.
Growing demand for Nvidia chips from data center operators has helped propel the company’s stock this year, pushing its market value above $1.1 trillion. The big question is how long Nvidia will effectively have the accelerator market to itself.
AMD sees an opening: Large language models — used by AI chatbots like OpenAI’s ChatGPT — need a huge amount of computer memory, and that’s where the chipmaker believes it has an advantage.
The new AMD chip has more than 150 billion transistors and 2.4 times the memory of Nvidia’s H100, the current market leader. It also has 1.6 times the memory bandwidth, further boosting performance, AMD said.
Su said the new chip is equal to Nvidia’s H100 in its ability to train AI software, and much better at inference — the process of running that software once it’s ready for real-world use.
While the company expressed confidence in its product’s performance, Su said it won’t just be a competition between two companies. Many others will also be vying for market share.
At the same time, Nvidia is developing its own next-generation chips. The H200 will succeed the H100 in the first half of next year, gaining a new type of high-speed memory. That should match at least some of what AMD has to offer. Nvidia is then expected to come out with an entirely new processor architecture later next year.
AMD’s prediction that AI processors will grow into a $400 billion market underscores the boundless optimism in the AI industry. That compares with $597 billion for the entire chip industry in 2022, according to IDC.
As recently as August, AMD had offered a more modest forecast of $150 billion over the same period. But it will take time for the company to capture a big piece of this market. AMD has said its own accelerator revenue will top $2 billion in 2024, with analysts estimating the chipmaker’s total sales at about $26.5 billion for the year.
The chips are based on a type of semiconductor called a graphics processing unit, or GPU, commonly used by video gamers seeking the most realistic experience. GPUs' ability to perform many calculations simultaneously has made them the best choice for training AI software.
(Updates with the CEO’s comments starting in the fifth paragraph.)
©2023 Bloomberg LP