Nov 13, 2023

Nvidia unveils H200, its newest high-end chip for training AI models

On Monday, Nvidia introduced the H200, a state-of-the-art graphics processing unit (GPU) designed to power the ongoing generative AI boom. The new GPU is an upgrade from the H100, the chip OpenAI used to train GPT-4, and is eagerly anticipated by buyers ranging from large corporations to startups and government agencies.

The H200 stands out for its 141 GB of next-generation HBM3e memory, which is designed to speed up "inference," the process of using trained AI models to generate text, images, or predictions. According to Nvidia, the H200 generates output nearly twice as fast as its predecessor, based on the company's tests with Meta's Llama 2 large language model (LLM).
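As a rough illustration of what such an inference workload looks like in practice, the short Python sketch below generates text with a Llama 2 model using the Hugging Face transformers library. The specific checkpoint, precision settings, and prompt are illustrative assumptions, not details of Nvidia's benchmark.

    # Minimal sketch of an LLM inference workload of the kind the H200 targets.
    # Assumes PyTorch, the Hugging Face transformers and accelerate packages,
    # a CUDA GPU, and access to the Llama 2 chat weights; the model choice and
    # prompt below are illustrative.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "meta-llama/Llama-2-7b-chat-hf"  # hypothetical checkpoint

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # half precision to fit in GPU memory
        device_map="auto",          # place the weights on the available GPU(s)
    )

    prompt = "Explain in one sentence what an AI accelerator does."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    # Generating each new token streams the model's weights through the GPU,
    # which is why larger, faster on-package memory speeds up inference.
    with torch.no_grad():
        output_ids = model.generate(**inputs, max_new_tokens=64)

    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Because each generated token requires reading the model's weights from memory, inference speed is largely bound by memory capacity and bandwidth, which is the bottleneck the H200's expanded memory is meant to address.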

These GPUs do not come cheap: H100 chips are priced between $25,000 and $40,000 each. Many buyers nonetheless consider the investment worthwhile given the chips' central role in creating and running large AI models. Enthusiasm for its AI-focused GPUs has driven Nvidia's stock up more than 230% in 2023, and the company anticipates roughly $16 billion in revenue for its fiscal third quarter, a 170% increase from the previous year.

The H200, expected to ship in the second quarter of 2024, will face competition from AMD's MI300X GPU, which likewise offers more memory than its predecessors. A notable feature of the H200 is its compatibility with the H100: organizations already running AI systems on H100s can adopt the new chip without changing their server systems or software.

Nvidia plans to offer the H200 in four-GPU and eight-GPU configurations in its HGX complete server systems, as well as in the GH200, a chip that pairs the H200 GPU with an Arm-based processor.

However, Nvidia's rapid development cycle suggests the H200 may not hold the top spot for long. Responding to high demand, the company has shifted from a two-year architecture cycle to an annual release pattern, and it plans to unveil its next-generation B100 chip, based on the forthcoming Blackwell architecture, in 2024. The accelerated cycle underscores how quickly AI and GPU technology are evolving, with significant performance gains continually sought after.
