NVIDIA CEO Jensen Huang Thinks OpenAI Could Be the Next Multi-trillion Hyperscaler

Nvidia CEO Jensen Huang gives a comprehensive year-in-AI recap and a forward-looking discussion of the explosive growth of AI infrastructure, scaling laws, partnerships, competitive dynamics, and geopolitical implications. Nvidia recently announced a Stargate partnership with OpenAI and an investment in the company. Huang frames AI computing as an industrial revolution on par with electricity or jet engines. He dismisses skeptics warning of bubbles and gluts, emphasizes Nvidia’s moat through extreme co-design, and advocates for U.S. leadership via pro-growth policies.

AI has evolved from one-shot inference (pre-training plus post-training) to thinking AI via chain-of-thought reasoning, tool use, multimodality, and agent systems. This integrates training and inference through reinforcement learning.

There are now three scaling laws.
1. Pre-training: memorizing and generalizing from data (like 8×8=64).
2. Post-training: AI “practicing” skills through iterative inference.
3. Inference: from one-shot answers to prolonged “thinking” (research, ground-truth checks). This potentially scales compute needs by a billion times, not just the 100× or 1,000× Huang predicted a year ago.

Token generation doubles every few months, so performance per watt must improve at a matching rate to avoid cost explosions.
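The compounding behind that claim can be sketched with a short calculation. The 3-month doubling period below is an assumption standing in for the article's "every few months"; the point is that demand growth and required perf/watt improvement move together.

```python
# Sketch: compound growth of token demand, assuming (hypothetically)
# a doubling period of ~3 months as a stand-in for "every few months".

def growth_factor(months: float, doubling_period_months: float) -> float:
    """Total demand multiplier after `months`, given a fixed doubling period."""
    return 2 ** (months / doubling_period_months)

# With a 3-month doubling period, one year of growth:
one_year = growth_factor(12, 3)  # 2**4 = 16x token demand
print(f"Token demand after 1 year: {one_year:.0f}x")

# To keep total cost flat, perf/watt (i.e., cost per token) must improve
# by the same factor; otherwise power and cost grow proportionally.
print(f"Required perf/watt improvement to hold cost flat: {one_year:.0f}x")
```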

This drives Nvidia’s annual advanced-chip release cadence:
Hopper → Blackwell: 30× performance via NVLink.
Then Rubin → Rubin Ultra → Feynman.

Nvidia will invest up to $100B in OpenAI over time. The investment is optional and not tied to sales, and it makes Nvidia a preferred partner for OpenAI’s self-built AI infrastructure.
Nvidia is more critical to OpenAI than Microsoft.
OpenAI’s Stargate: 10GW of data centers, additive to existing Azure/OCI/CoreWeave builds totaling 5-7GW+. If Nvidia supplies it all, that is roughly $400B of revenue potential.
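The implied dollars-per-gigawatt rate behind the $400B figure is easy to back out. The ~$40B/GW rate below is inferred from the article's two numbers, not quoted anywhere, and applying it to the existing 5-7GW builds is purely illustrative.

```python
# Sketch of the implied revenue-per-gigawatt arithmetic behind the
# "10GW -> ~$400B" figure. The $/GW rate is inferred, not quoted.

stargate_gw = 10                 # announced Stargate capacity
revenue_potential_usd_b = 400    # ~$400B if Nvidia supplies it all

revenue_per_gw = revenue_potential_usd_b / stargate_gw
print(f"Implied Nvidia revenue: ~${revenue_per_gw:.0f}B per GW")

# Applied to the existing 5-7GW of Azure/OCI/CoreWeave builds
# (assumption: the same $/GW rate holds for those deployments):
for gw in (5, 7):
    print(f"{gw}GW at the same rate: ~${gw * revenue_per_gw:.0f}B")
```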

OpenAI is transitioning from outsourcing compute (to Microsoft and others) to a hyperscaler-like self-build strategy.

AI Productivity

AI augments human intelligence, which underpins roughly $50T of global GDP (55-65% of the total).

If a $100K-per-year employee gets $10K of AI and becomes 2-3× more productive, that is hugely valuable.

Nvidia’s 100% co-agent coverage (AI agents for every employee) boosts hiring and growth.

A $10T augmentation market implies roughly $5T of capex (at ~50% gross margins) for AI factories that generate tokens continuously, unlike static software.

TAM estimate: ~$400B annually today, growing 4-5× to $1-2T+ by decade’s end. Alibaba projects 10× data-center power by 2030. That power (watts) correlates with Nvidia revenue.
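The growth rate implied by that TAM path is worth spelling out. The 2025-2030 five-year span below is an assumption; the article only says "by decade-end".

```python
# Sketch: implied compound annual growth rate (CAGR) for the TAM claim,
# assuming a five-year span (2025 -> 2030) from $400B to $1-2T.

def cagr(start: float, end: float, years: int) -> float:
    """Constant annual growth rate that takes `start` to `end` in `years`."""
    return (end / start) ** (1 / years) - 1

for end_t in (1.0, 2.0):
    rate = cagr(0.4, end_t, 5)   # values in $ trillions
    print(f"$0.4T -> ${end_t}T in 5 years: {rate:.0%}/yr")
```

Roughly 20%/yr at the low end and close to 40%/yr at the high end, i.e., sustained hypergrowth rather than a one-time step.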

Nvidia’s Moat: Extreme Co-Design and Annual Cadence

Nvidia’s pace of annual releases would be impossible without internal AI co-design.
Progress has been exponential: over the last ten years, performance from Kepler to Hopper increased 100,000×.
Hopper → Blackwell: another 30× via NVLink.
Roadmap: Blackwell (2025), Rubin (H2 2026), Rubin Ultra (2027), Feynman (2028).
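A quick check on what "100,000× in ten years" means on an annual basis, to show why it sits beyond Moore's Law. The ten-year span is from the article; the Moore's Law comparison in the comment is a standard rule of thumb, not a quote.

```python
# Sketch: the average annual performance multiplier implied by
# "100,000x over ten years" (Kepler -> Hopper).

total_gain = 100_000
years = 10
annual = total_gain ** (1 / years)   # geometric mean per-year gain
print(f"Implied average gain: ~{annual:.2f}x per year")

# Well above transistor-driven Moore's Law (~2x every 2 years), which
# is the point of extreme co-design: gains come from the whole stack
# (model, algorithm, system, chip), not transistors alone.
```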

Extreme co-design optimizes the model, algorithm, system, and chip simultaneously, going beyond Moore’s Law. There are 6-8 chip variants each year.

The platform spans GPUs, CPUs, networking, NVLink, and Spectrum-X Ethernet. Ethernet is Nvidia’s fastest-growing business, and the platform scales to multi-factory clusters.

Moat strength is increasing: there is more competition, but competing is harder due to scale and wafer costs.
Only Nvidia gets $50B purchase orders on unproven architectures, because only Nvidia has a proven ecosystem. The supply chain pre-builds hundreds of billions of dollars of capacity on Nvidia’s demand visibility.

CUDA programmability enables transformer experiments.

ASIC competitors are limited to niches (like transcoders). Large markets demand customer-owned tooling.

Google’s TPUs are on version 7, yet Google still buys Nvidia GPUs. Even free ASICs lose to Nvidia’s 30× perf/watt: at a fixed power budget (e.g., 2GW), higher perf/watt translates directly into more revenue.
AI factories are disaggregated; the Dynamo inference framework is open-sourced; NVLink Fusion integrates Intel and ARM CPUs.

Ecosystem and Broader Impacts

Elon/xAI/Tesla: Huang praises Elon’s engineering. xAI’s Colossus 1: 230K H100s/H200s/B200s. Colossus 2: 500K GB200s, soon 1M; xAI is potentially first to 1GW. The xAI investment is incredible, and xAI has a full-stack build advantage.

Sovereign AI is an existential need, with Nvidia as the global infrastructure partner. There is, and will continue to be, an energy renaissance in nuclear and gas.

The U.S. Trump administration is pro-growth, pro-energy, and pro-tech, with an open door for CEO access. Exports will accelerate, alongside reindustrialization and upskilling. AI can be an equalizer, closing the tech divide since no coding is needed.

Industrial and digital revolutions accelerated GDP growth; AI co-workers for billions of people could drive 4%+ growth.

Abundance age: raise the floor through reindustrialization and AI for all 8 billion people.

NBF commentary:
OpenAI could reach $2-5 trillion in value by 2030, and Nvidia $8-15 trillion.
If xAI is the AI winner, it could reach $5-10 trillion by 2030.

Reader comment: Feynman: there is no longer plenty of room at the bottom for transistors. When I retired, you could count the atoms in the gate oxide; now you can count the atoms in the channel.
