Demand for AI processors will gain momentum, while the industry faces consolidation

Wylie Wong, Contributor

December 23, 2022

3 Min Read

The demand for specialized AI processors will continue to soar in 2023 as AI deployments surge across all types of enterprises, end users, cloud service providers and even telecommunications service providers, said Manoj Sukumaran, principal analyst for data center IT at sister research firm Omdia.

“The AI chip market is expected to continue the growth momentum it has gained in the last couple of years,” he said. “The size of some of these deep learning models are going beyond trillions of parameters, and it would need significant compute capacity to train and deploy these models.”

For example, Meta expects its deep learning recommendation models (DLRMs) to grow to more than 10 trillion parameters in the near future, which would require a zettaflop of compute to train, Sukumaran said.

Here’s what Sukumaran expects for the AI chip market in 2023:

  • About 2 million servers shipped in 2023 will have at least one co-processor to accelerate some compute workload, up 53% from 2022. The significant majority of these co-processors will be GPUs, TPUs and specialized AI accelerators.

  • Nvidia’s H100 “Hopper” Tensor Core GPU will be commercially available in 2023. Intel’s first data center GPU, code-named Ponte Vecchio, is also expected in the first half of 2023, while Tesla’s Dojo supercomputer, built on its custom Dojo D1 silicon, is expected in late 2023.

  • The AI chip market faces consolidation. Over the last few years, silicon startups bloomed to serve the AI processor market, but 2023 will be a tough year for some of them as venture funding dries up and most do not yet have major revenue streams.

As these startups struggle to sustain themselves in 2023, they may become acquisition targets as a result, said Vladimir Galabov, head of Omdia’s Cloud and Data Center Research Practice.

Consolidation previously occurred in the market when Intel acquired AI chipmaker Habana Labs for about $2 billion in 2019 and deep learning startup Nervana Systems for about $400 million in 2016, he said.

“It is not just the silicon, but a robust software stack to take advantage of the silicon capabilities is what differentiates companies in this market. And that is what exactly many of these startups are struggling with,” Sukumaran said. “If you look at the leader in the market, Nvidia, its biggest strength is software. Even their biggest competitors Intel and AMD do not have a robust software stack.”

That said, there will be winners in niche market segments, Sukumaran said. Startups such as Cerebras and SambaNova Systems have found their niches and positioned themselves well in the AI market, he said.

In 2023, data center operators will increasingly match specific AI processors with specific workload needs to maximize performance, Galabov said.

For example, at the recent AWS re:Invent conference, Amazon Web Services said it has chosen Intel’s Habana chips for machine learning models for computer vision because they deliver better performance, but AWS will use its own Trainium chip for language models because its own chip design is better suited for that purpose, he said.

On a broader scale, Moore’s Law is still alive, Galabov said.

Moore’s Law holds that the number of transistors in an integrated circuit doubles roughly every two years, which implies that a leading-edge chip today should pack about 100 billion transistors.

In 2022, Apple unveiled the M1 Ultra, which consists of 114 billion transistors. AMD’s 4th Gen Epyc ‘Genoa’ chip features 90 billion transistors, while Intel’s forthcoming Ponte Vecchio chip has more than 100 billion transistors, he said.

“Those three products have kept us on track with Moore’s Law,” Galabov said.

It is possible that Moore’s Law will survive through 2024, Galabov added.

“If we managed to build a processor with 100 billion transistors today, Moore’s Law would suggest that in 2024, we have to reach 200 billion transistors. At the moment, I think it is likely we will get close. But by 2026, we’re probably going to struggle to keep up with Moore’s Law,” he said.
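The arithmetic behind those figures is easy to check. Below is a minimal back-of-the-envelope sketch, assuming a 1971 baseline of roughly 2,300 transistors (Intel’s first microprocessor, a baseline not cited in the article) and a strict two-year doubling period:

```python
# Rough check of the Moore's Law figures cited above.
# Assumptions (not from the article): a 1971 baseline of about 2,300
# transistors (Intel's first microprocessor) and a clean two-year doubling.

BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Transistor count implied by doubling every two years since 1971."""
    doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASELINE_TRANSISTORS * 2 ** doublings

for year in (2022, 2024, 2026):
    print(f"{year}: ~{projected_transistors(year) / 1e9:.0f} billion transistors")

# Approximate output:
# 2022: ~109 billion transistors  -> close to the ~100 billion cited for today
# 2024: ~218 billion transistors  -> roughly the 200 billion Galabov cites for 2024
# 2026: ~437 billion transistors  -> the mark he expects will be hard to hit
```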

About the Author(s)

Wylie Wong

Contributor, AI Business

Wylie Wong is an award-winning freelance journalist specializing in technology, business and sports. He previously worked at CNET, Computerworld and CRN. An avid sports fan, he is the co-author of ‘Giants: Where Have You Gone?’, a book about the lost heroes and fan favorites of the San Francisco Giants baseball team.
