News

Nvidia unveils H200, its newest high-end chip for training AI models

By News Room | November 13, 2023

Nvidia on Monday unveiled the H200, a graphics processing unit designed for training and deploying the kinds of artificial intelligence models that are powering the generative AI boom.

The new GPU is an upgrade from the H100, the chip OpenAI used to train its most advanced large language model, GPT-4. Big companies, startups and government agencies are all vying for a limited supply of the chips.

H100 chips cost between $25,000 and $40,000, according to an estimate from Raymond James, and thousands of them working together are needed to create the biggest models in a process called “training.”

Excitement over Nvidia’s AI GPUs has supercharged the company’s stock, which is up more than 230% so far in 2023. Nvidia expects around $16 billion of revenue for its fiscal third quarter, up 170% from a year ago.

The key improvement with the H200 is that it includes 141GB of next-generation “HBM3” memory that will help the chip perform “inference,” or using a large model after it’s trained to generate text, images or predictions.

Nvidia said the H200 will generate output nearly twice as fast as the H100. That’s based on a test using Meta’s Llama 2 LLM.

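To make the distinction concrete, here is a minimal sketch of "inference" in the sense the article uses it: loading an already-trained large language model and asking it to generate text. It uses the Hugging Face transformers library with PyTorch; the specific model ID, precision, and generation settings are illustrative assumptions rather than details from Nvidia's announcement, and the Llama 2 checkpoint shown is gated and requires access approval.

# Hedged sketch of GPU inference with a trained LLM (illustrative only).
# Assumes the transformers, accelerate, and torch packages are installed
# and that access to the gated Llama 2 checkpoint has been granted.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed example model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the weights fit in GPU memory
    device_map="auto",          # spread layers across whatever GPUs are available
)

prompt = "Explain in one sentence what GPU inference means."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
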
The H200, which is expected to ship in the second quarter of 2024, will compete with AMD’s MI300X GPU. Like the H200, AMD’s chip packs more memory than its predecessors, which helps large models fit on the hardware to run inference.

Nvidia said the H200 will be compatible with the H100, meaning that AI companies that are already training with the prior model won’t need to change their server systems or software to use the new version.

Nvidia says it will be available in four-GPU or eight-GPU server configurations on the company’s HGX complete systems, as well as in a chip called GH200, which pairs the H200 GPU with an Arm-based processor.

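As a small, hedged aside (not from the article): before launching a job on one of those multi-GPU servers, it is common to confirm how many GPUs the machine actually exposes, for example with PyTorch.

# Hedged sketch: list the GPUs visible to PyTorch on a multi-GPU server,
# e.g. the four- or eight-GPU HGX configurations mentioned above.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        print(f"GPU {i}: {torch.cuda.get_device_name(i)}")
else:
    print("No CUDA-capable GPUs visible to PyTorch")
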
However, the H200 may not hold the crown of the fastest Nvidia AI chip for long.

While companies like Nvidia offer many different configurations of their chips, new semiconductors often take a big step forward about every two years, when manufacturers move to a different architecture that unlocks more significant performance gains than adding memory or other smaller optimizations. Both the H100 and H200 are based on Nvidia’s Hopper architecture.

In October, Nvidia told investors that it would move from a two-year architecture cadence to a one-year release pattern due to high demand for its GPUs. The company displayed a slide suggesting it will announce and release its B100 chip, based on the forthcoming Blackwell architecture, in 2024.

