Tesla’s Hidden Compute Power

I did a video with Herbert Ong of Brighter with Herbert.

I explained why the chart from last year showing Tesla going to 100 Exaflops of compute in October 2024 is out of date.

Elon Musk said that both xAI and Tesla had over 30,000 H100 chip equivalents. Each Nvidia H100 delivers about 4 petaflops of AI compute (FP8 Tensor Core, with sparsity).

If Tesla has over 30,000 Nvidia H100s, then Tesla already has about 120 Exaflops of compute (30,000 × 4 petaflops).

There was also word that the wait time for Nvidia H100s is down to six weeks. This means Tesla has spent or will spend about $3 billion to get roughly 100,000 H100s, which would be 400 Exaflops of compute. xAI is going to buy a total of 100,000 Nvidia H100s to get the 400 Exaflops needed to train a GPT-5-class Grok 3 model.

Before the end of the year, Tesla will likely buy and install at least 20,000 B100s. At 20 petaflops each, that is another 400 Exaflops, which combined with the 400 Exaflops from 100k H100s makes 800 Exaflops of compute.
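A quick back-of-the-envelope sketch of these totals in Python; the 4 petaflops per H100 is the FP8 Tensor Core figure (with sparsity), while the 20 petaflops per B100 and the roughly $30,000 per H100 are the assumptions implied above:

```python
# Back-of-the-envelope cluster compute totals.
# Assumed per-chip figures: H100 ~4 petaflops (FP8, sparse),
# B100 ~20 petaflops, H100 price ~$30,000 (implied by $3B for 100k).

PETAFLOPS_PER_EXAFLOP = 1000

def cluster_exaflops(num_chips: int, petaflops_per_chip: float) -> float:
    """Total cluster compute in exaflops."""
    return num_chips * petaflops_per_chip / PETAFLOPS_PER_EXAFLOP

h100_now  = cluster_exaflops(30_000, 4)    # ~120 exaflops today
h100_full = cluster_exaflops(100_000, 4)   # ~400 exaflops at 100k H100s
b100_add  = cluster_exaflops(20_000, 20)   # ~400 exaflops from 20k B100s

print(f"30k H100s:  {h100_now:.0f} exaflops")
print(f"100k H100s: {h100_full:.0f} exaflops")
print(f"20k B100s:  {b100_add:.0f} exaflops")
print(f"Combined:   {h100_full + b100_add:.0f} exaflops")
print(f"100k H100s at $30k each: ${100_000 * 30_000 / 1e9:.0f} billion")
```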

Tesla and the other AI companies also need a lot of fast memory. The training system needs to treat the thousands of compute chips and petabytes of high-bandwidth memory as one continuous pool.
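For a sense of scale, here is a minimal sketch of the aggregate high-bandwidth memory a 100,000-GPU cluster would present; the 80 GB of HBM per H100 is the published SXM spec, and treating it all as one flat pool is the simplifying assumption:

```python
# Aggregate high-bandwidth memory across a 100,000-GPU H100 cluster.
# 80 GB of HBM per H100 SXM is the published spec; interconnects like
# NVLink and InfiniBand are what let training code approximate the
# "one continuous pool" view of all that memory.

gpus = 100_000
hbm_per_gpu_gb = 80

total_hbm_pb = gpus * hbm_per_gpu_gb / 1e6  # GB -> petabytes
print(f"Aggregate HBM: {total_hbm_pb:.0f} petabytes")  # ~8 PB
```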

Here I have tables showing how we can expect the compute, memory, data needs and energy to scale for each of the major AI training centers.

The data needs are huge, and the primary source of additional data is real-world video like what is gathered by the roughly 6 million Tesla cars on the road (about $250 billion worth of vehicles), each with 8 cameras and a teraflop-class driving inference chip.

Elon said that with GPT-5 and Grok 3, the models have run out of regular text, image and video data. This is about 40 trillion tokens worth of data. Getting more data will involve real-world video, real-world audio and synthetic data. Synthetic data is an AI extrapolating from the data it has to generate statistically similar data.
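A rough sketch of why the fleet matters as a new data source; the 6 million cars and 8 cameras are from the article, while the one hour of driving per day and roughly 1 GB per camera-hour of compressed video are illustrative guesses:

```python
# Rough estimate of raw video the Tesla fleet could capture per day.
# Cars and camera count are from the article; driving hours per day
# and compressed bitrate per camera are illustrative assumptions.

cars = 6_000_000
cameras_per_car = 8
driving_hours_per_day = 1.0
gb_per_camera_hour = 1.0  # ~1 GB/hour of compressed video (guess)

daily_pb = cars * cameras_per_car * driving_hours_per_day * gb_per_camera_hour / 1e6
print(f"~{daily_pb:.0f} petabytes of raw fleet video per day")  # ~48 PB/day
```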

The scaling will be so large that, if the value for the world is making better and better SuperAI, then civilization could organize itself around making more and better chips, with faster and bigger memory, to make better AI products like self-driving cars and humanoid bots. Those self-driving cars and humanoid bots would have cameras to record and learn from things happening in the world, and they would send that information back to further improve the AI.

10 thoughts on “Tesla’s Hidden Compute Power”

  1. Your article hits on an important point that has been hinted at by some.

    Training data is king (outside of energy). With the wells all used up (books, websites, etc.),
    whoever has new sources will lead:
    YouTube (video), TikTok (video), Tesla (car video, bot video), X (text).

    Elon was playing the long game in buying Twitter, gaining a training data source. (The value of much of that is questionable, but still.)

    Tesla will very likely start buying X data when the bot starts needing to speak to and understand humans.

    Tesla will also have a leg up on the energy front. They will probably start deploying their own commercial solar and battery storage.

  2. What would you say if you were living at the time of the Exodus and witnessing the making of the Golden Calf?

  3. Tesla wants to stay in the AI race. They will focus on self-driving. Microsoft, Meta, … will buy more H100s, but it is not just compute!! Better coding and better learning algorithms are important. A human has far fewer resources available and is still so much better at many things.

  4. Sounds like Elon needs to send his upcoming fleet of Starships out to mine asteroids to build space based solar power to run orbiting compute nodes linked by lasercom. Makes an interesting model for future Starlink if it’s not just telecom but cloud compute and cloud AI. Especially if you can run a two-way high bandwidth connection to a tablet or laptop anywhere on earth.

  5. Oh no, you’re not starting to do that thumbnail thing with the massive faces, are you? Some marketing type told you you’re not doing as well with Gen Z as you could be, did they?

  6. Clearly, this technology needs to be heavily optimized for energy efficiency, because power generation can’t ramp up at this speed.

    All this power is directed at trying to emulate a bunch of humans, who consume about 110 Watts per person on average (all body functions included).
    That is about 38,500 kWh for one person over 40 years, including all training and ongoing inference (the arithmetic is sketched after the comments).

    How many persons does it take to hold all relevant knowledge?

  7. Great interview! If these projections hold, there is going to be a massive build-out of solar and wind to power all that computation. This could have the effect of temporarily raising the cost of electricity until the build-out is complete.

    • That’s an interesting thought… a future sci-fi movie plotline? It seems more likely AI will be interested in hunting for energy to power itself. Perhaps that’s where AI goes rogue!!! It scours the earth for any power sources it can find.
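Comment 6’s energy figure checks out; here is a minimal sketch of that arithmetic (the 110 W average human power draw is the commenter’s number):

```python
# Checking comment 6's arithmetic: a human's ~110 W average power
# draw (the commenter's figure), sustained over 40 years.
watts = 110
years = 40
hours = years * 365.25 * 24  # ~350,640 hours

kwh = watts * hours / 1000   # watt-hours -> kilowatt-hours
print(f"{kwh:,.0f} kWh over {years} years")  # ~38,570 kWh
```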
