When OpenAI released its ChatGPT chatbot about a year ago, it unleashed a new craze on an unsuspecting world. Artificial intelligence (AI) remains the talk of Wall Street and Silicon Valley, and this is especially true of the generative AI technology that powers systems like ChatGPT.
Investors soon caught on that Nvidia (NVDA -0.33%) provided the AI acceleration hardware that made it all possible, with spectacular results. Nvidia's stock price has more than tripled in 2023, driven by actual sales of AI-related processors and the expectation of continued dominance in this hot market segment.
But Nvidia shouldn’t rest on its laurels. It’s not the only chip designer on the market, and it’s certainly not the only one interested in this lucrative AI opportunity.
The latest challenger to step into the ring against Nvidia's AI dominance is Samsung Electronics (SSNL.F -28.75%). The Korean tech titan has partnered with Naver (OTC: NHNC.F), a homegrown online entertainment giant, to develop hardware and software that purport to match or exceed the best tools available today.
Specifically, Samsung and Naver claim that the upcoming AI chip will be eight times more energy efficient than Nvidia’s H100 accelerator.
That's not the same thing as a performance record, but a more efficient solution might actually pose an even bigger threat to Nvidia's throne. Here's why.
The advantage of efficiency in AI computing
In the realm of high-performance artificial intelligence computing, efficiency is key. Pure performance doesn't really matter, because you can always throw more hardware at the number-crunching problem.
The supercomputers that train AI systems like ChatGPT are equipped with thousands of Nvidia A100 accelerators, each packing nearly 7,000 processing cores. The real challenge is feeding enough power to run these beasts and then cooling the resulting space heater. The OpenAI/Nvidia system draws 7.4 megawatts at full power, comparable to a cruise ship at sea or a large steel mill.
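A quick back-of-envelope calculation shows how a cluster like that reaches megawatt scale. The figures below are illustrative assumptions for the sketch, not published specifications: a hypothetical 10,000-accelerator cluster, roughly 400 watts per accelerator, and an assumed 85% facility overhead for host CPUs, networking, and cooling.

```python
# Back-of-envelope sketch of a large AI training cluster's power draw.
# All three inputs are illustrative assumptions, not official specs.
accelerators = 10_000            # assumed accelerator count
watts_per_accelerator = 400      # assumed per-accelerator draw in watts
overhead_factor = 1.85           # assumed multiplier for CPUs, network, cooling

total_mw = accelerators * watts_per_accelerator * overhead_factor / 1e6
print(f"Estimated cluster draw: {total_mw:.1f} MW")
```

With those assumptions the estimate lands at 7.4 MW, in the same ballpark as the figure cited above; change any input and the total scales linearly.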
Therefore, the AI giants are really looking for power-saving solutions that deliver better results per watt.
Samsung and Naver's claim of an AI chip that is eight times more energy efficient than Nvidia's H100 could represent a paradigm shift. In an increasingly energy- and cost-conscious world, a more efficient chip doesn't just mean lower power bills; it means a smaller carbon footprint, a more compact physical installation, and the ability to deploy more powerful AI systems without prohibitive energy costs.
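To put a rough number on what an eight-fold efficiency gain could mean, here is a hypothetical sketch. It takes the 7.4 MW cluster figure mentioned above as a baseline and assumes an electricity price of $0.10 per kilowatt-hour; both the price and the idea that the full gain would translate directly into lower draw are assumptions for illustration only.

```python
# Illustrative sketch: annual electricity savings if a workload that
# draws 7.4 MW could run at one-eighth the power. The $0.10/kWh rate
# is a hypothetical assumption, not a quoted price.
baseline_mw = 7.4
efficiency_gain = 8
price_per_kwh = 0.10
hours_per_year = 24 * 365

def annual_cost(megawatts):
    """Yearly power cost in dollars for a constant draw in MW."""
    return megawatts * 1_000 * hours_per_year * price_per_kwh

saved = annual_cost(baseline_mw) - annual_cost(baseline_mw / efficiency_gain)
print(f"Hypothetical annual power savings: ${saved:,.0f}")
```

Under these assumptions the savings come to roughly $5.7 million per year for a single cluster, before even counting the smaller cooling plant and floor space a lower-power installation would need.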
Nvidia's dominance is being challenged
Nvidia has long been the go-to provider of AI-accelerating hardware, which is reflected in its rising share price and market position. However, as Samsung and Naver enter the arena with their promise of a breakthrough energy-efficient AI chip, Nvidia faces a new kind of competition. It's no longer just about who has the fastest chip on the market; it's about who can deliver in the most efficient way possible. And this time, Nvidia may not be the clear winner.
This development is not just a two-horse race. Companies like AMD and Intel are also in the fray, each with its own AI chip solutions. The AMD Instinct MI300 series features more memory and lower power requirements than the previous generation. Intel's Gaudi3 solution focuses on faster raw performance and a next-generation network that connects processors together. Each has its own master plan.
But these alternatives never claimed to blow Nvidia's power efficiency out of the water. Samsung and Naver's focus on low power requirements could set a new standard, forcing others to follow suit while still giving the Korean duo a strong first-mover advantage. As AI technologies are increasingly integrated into a variety of sectors, from healthcare to finance, the demand for efficient, powerful, and cost-effective AI computing will only grow.
What’s next?
This is all theory so far. Benchmark tests and real-world installation results will come in 2024 and beyond, as Nvidia's rivals roll out mass-market volumes of their new AI chips. Investors are left to make educated guesses about how closely each company's claims will match the bottom line, from energy consumption and raw performance to next-level connectivity and other potentially game-changing ideas.
Will Samsung and Naver’s AI-centric chip deliver on its promises? How will Nvidia and other competitors react? Time will tell, but one thing is clear: The AI chip market is evolving, and with it, the landscape of AI itself. The coming years will be critical in determining the direction of this technology and its impact on the world.
I can't say for sure which company (or companies) will dominate the AI hardware market in the long run, but Samsung has just joined the ranks of potential winners. If you didn't think of Samsung as a top AI chip designer before, it's time to reconsider.