LLM Profitable – Nvidia H200 Are Very Profitable for AI Companies

Nvidia reports that, according to its own data, H200 GPUs are profitable for the AI companies that deploy them.

Nvidia claims that $1 of H200 cost can generate $7 in revenue over 4 years when serving Meta Llama 3. By that math, an AI company that buys a $40,000 H200 can generate about $280,000 in inference revenue over 4 years.
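The arithmetic behind that claim is straightforward; a minimal sketch, using the figures quoted in the article (the 7x multiple is Nvidia's own claim, not an independent estimate):

```python
# Nvidia's claimed revenue multiple for H200 inference serving Llama 3.
REVENUE_MULTIPLE = 7        # $7 of revenue per $1 of GPU cost over 4 years
H200_COST_USD = 40_000      # approximate H200 price used in the article

revenue = H200_COST_USD * REVENUE_MULTIPLE
print(f"Projected 4-year revenue per H200: ${revenue:,}")  # $280,000
```

Note this is gross revenue, not profit: power, cooling, networking, and staffing costs would all come out of the $280,000.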

The Nvidia H200 has twice the AI inference capability of the H100.

If AI companies are profitable using Nvidia chips, they can continue to buy Nvidia chips.

Nvidia shares have broken above $1,000, and the company has announced a ten-for-one stock split.

Each Nvidia H200 running Meta Llama 3 can support about 2,400 users by processing 24,000 tokens per second.
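Those two figures imply a per-user token rate; a quick check of the implied throughput per concurrent user, assuming the article's numbers:

```python
# Throughput figures quoted in the article for one H200 serving Llama 3.
TOKENS_PER_SECOND = 24_000   # aggregate inference throughput
CONCURRENT_USERS = 2_400     # users supported simultaneously

per_user = TOKENS_PER_SECOND / CONCURRENT_USERS
print(f"Tokens per second per user: {per_user:.0f}")  # 10
```

Ten tokens per second per user is roughly comparable to human reading speed, which is why that concurrency figure is plausible for a chat-style workload.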

Nvidia will be moving to liquid cooling for its Blackwell-generation AI data centers.

3 thoughts on “LLM Profitable – Nvidia H200 Are Very Profitable for AI Companies”

  1. Saw a comment by a researcher recently that current neural networks are like the vacuum tubes of AI — something that is a profound milestone and lets us start working with a technology but will soon become a synonym for “primitive” when better architecture comes along. We are still reaping rewards from increases in compute scale but this disguises how much room for improvement there is when we get better systems. It’s just that we don’t know what the analogy to silicon chips for AI will be that will make LLM vacuum tubes obsolete.

  2. The question is:

    Is this a race to the bottom for AI? Will the proliferation of AI bots and agents and services lead everyone to think AI should be free? What happens when reddit and X and every social media company prohibits scraping? Do you trust an LLM trained on Wikipedia? How many ‘News’ sources are merely opinion pieces claiming objectivity?

    Training on data that is artificial has its own set of problems…
