Nvidia: H100 beats its current fastest AI chip by 4.5 times

Nvidia holds first public demonstration of the GPU.

Ben Wodecki

September 12, 2022

Nvidia has given the tech world a first glimpse of its upcoming H100 GPU, revealing new performance records on the MLPerf benchmark.

According to Nvidia, the H100 ‘Hopper’ Tensor Core GPU set “world records” in its MLPerf debut, delivering inference results up to 4.5 times faster than the company’s current fastest AI chip, the A100.

Nvidia said the H100 “excelled” on the BERT model for natural language processing, one of the largest and most compute-intensive MLPerf AI models.

The company attributes the H100’s performance to its ‘Transformer Engine,’ which is designed to accelerate AI inference.

The H100 will pack 80 billion transistors, nearly 50% more than the 54 billion in the 7nm A100. Capable of up to 30 teraflops of peak standard IEEE FP64 performance, the chip is expected to be highly sought after for AI and supercomputing applications.
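For reference, the transistor-count comparison works out as follows; this is a quick arithmetic sketch assuming the A100's widely reported figure of 54.2 billion transistors.

```python
# Percent increase in transistor count from A100 to H100.
a100_transistors = 54.2e9  # A100 (7nm), widely reported figure
h100_transistors = 80e9    # H100, per Nvidia

increase = (h100_transistors - a100_transistors) / a100_transistors
print(f"{increase:.1%}")  # roughly a 48% increase, not 70%
```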

This marks the first public demonstration of the H100; the GPU is set to hit the market later this year.

However, Nvidia warned that development of the product line could be impacted by new export restrictions.

The U.S. is imposing a new license requirement on Nvidia’s exports of AI chips to China and Russia, citing the risk that the chips could be diverted for military use, according to a Securities and Exchange Commission filing.

The chipmaker said the requirement could impact chip sales. Nvidia’s third-quarter revenue outlook includes around $400 million in potential sales to China. It does not have clients in Russia.

However, Nvidia is seeking a U.S. government exemption for its “internal development and support activities,” and it is working with customers in China to offer data center products not subject to the new license requirement.

About the Author

Ben Wodecki

Assistant Editor
