Can Nvidia's H200 Chip Become The New Maestro Of AI Model Training?
By TD NewsDesk
Updated on Tue, Nov 14, 2023
On Monday, 13th November, Nvidia introduced the H200, a graphics processing unit (GPU) optimized for the training and deployment of AI models.
OpenAI's most sophisticated large language model, GPT-4, was trained on the H100; the new H200 is its improved successor. Financial services firm Raymond James estimates that a single H100 chip costs somewhere around $30,000, and thousands of such chips operating in tandem are needed to build the largest models in a process called "training."
Excitement about Nvidia's AI GPUs has driven the company's stock up more than 230% so far in 2023. Revenue for Nvidia's fiscal third quarter is projected to be over $16 billion, up 170% year over year.
For "inference," or employing a large model after it has been trained to generate text, images, or predictions, the H200's 141GB of next-generation HBM3e memory is a major boost.
Nvidia claims the H200's output speed is approximately twice that of the H100, a conclusion supported by a test with Meta's Llama 2 LLM.
The H200 will go up against AMD's MI300X GPU and is scheduled for release in the second quarter of 2024. “To create intelligence with generative AI and HPC applications, vast amounts of data must be efficiently processed at high speed using large, fast GPU memory,” said Ian Buck, vice president of hyperscale and HPC at NVIDIA. “With NVIDIA H200, the industry’s leading end-to-end AI supercomputing platform just got faster to solve some of the world’s most important challenges.”
So, what can we anticipate from Nvidia’s new AI chip?
Nvidia has stated that the H200 is compatible with systems that already support the H100, meaning that businesses utilizing artificial intelligence won't have to overhaul their existing server infrastructure or training software to take advantage of the new model.
According to Nvidia, the H200 will be offered in four-GPU and eight-GPU server configurations on the company's HGX complete systems, and will also be integrated into the GH200 chip, which pairs the H200 GPU with an Arm-based processor.
The H200 is currently the fastest Nvidia AI chip - but for how long?
What were the concerns for Nvidia?
Companies such as Nvidia offer a wide variety of chip configurations. Yet, new semiconductors typically make a significant leap forward every two years, when manufacturers adopt a new architecture that allows for much larger performance gains than can be achieved by simply adding memory or making other minor optimizations. Notably, the H100 and H200 are built on the same Nvidia architecture. Moreover, as demand for its GPUs increased, Nvidia said in October that it would shift from a two-year architecture cycle to a one-year release rhythm, a change that will take some adjustment.
Can Nvidia turn up the AI heat with its high-end H200 chip for training AI models?
Let us know your thoughts in the comments section below!
First published on Tue, Nov 14, 2023