Tesla Distributed Computing Revenue

Every Tesla car with Hardware 3 (HW3) carries a computer capable of roughly 144 trillion operations per second, and Hardware 4 (HW4) cars have about 400 trillion operations per second.
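As a back-of-envelope illustration, the per-car figures above can be aggregated across a hypothetical fleet. The fleet sizes below are illustrative assumptions, not Tesla disclosures:

```python
# Aggregate compute for a hypothetical Tesla fleet.
# Per-car TOPS figures are from the article; fleet sizes are assumptions.
HW3_TOPS = 144  # trillion ops/sec per HW3 car
HW4_TOPS = 400  # trillion ops/sec per HW4 car

def fleet_exaops(hw3_cars: int, hw4_cars: int) -> float:
    """Aggregate fleet compute in exa-ops/sec (1 exa-op = 1e18 ops)."""
    total_tops = hw3_cars * HW3_TOPS + hw4_cars * HW4_TOPS
    return total_tops * 1e12 / 1e18  # TOPS -> ops/sec -> exa-ops/sec

# Hypothetical fleet: 5 million HW3 cars, 2 million HW4 cars
print(fleet_exaops(5_000_000, 2_000_000))  # 1520.0 exa-ops/sec
```

Even at a small fraction of utilization, a fleet of millions of cars would represent data-center-scale aggregate compute, which is the premise of the revenue idea.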

Elon Musk has said Tesla could generate revenue for itself and for Tesla car owners by selling that compute for AI inference.

The implied value of that compute is roughly $400 apiece for Tesla and for each HW3 owner, and roughly $1,000 apiece for Tesla and for each HW4 owner.

The key challenge is creating AI that can solve genuinely hard and valuable problems using that compute. These would likely be major finance, crypto, or other revenue-generating AI workloads.

This is the key innovation Tesla and Elon have to deliver with xAI/Grok: solving valuable problems that generate revenue on the distributed supercomputer fleet.

Cern Basher has tried to model the potential distributed computing revenue. The assumption is that companies have valuable problems to run as inference. His view on distributed inference computing for EVs and bots: also difficult to model, but it shows some promise if massively scaled. Most of the opportunity is with EVs, as they have more compute and more downtime, but the bots could make up for those shortcomings if scaled into the hundreds of millions or billions.
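A minimal sketch of this kind of model, where every parameter (fleet size, idle hours, utilization, price per compute-hour) is an illustrative assumption rather than Cern Basher's actual figures:

```python
# Rough distributed-inference revenue sketch. All parameters below are
# illustrative assumptions, not figures from Cern Basher's model.
def annual_revenue(cars: int, idle_hours_per_day: float,
                   utilization: float, dollars_per_hour: float) -> float:
    """Gross annual fleet revenue in dollars, before any Tesla/owner split."""
    sold_hours = cars * idle_hours_per_day * 365 * utilization
    return sold_hours * dollars_per_hour

# Hypothetical: 5M cars, 20 idle hours/day, 10% utilization, $0.05/compute-hour
rev = annual_revenue(5_000_000, 20, 0.10, 0.05)
print(f"${rev:,.0f}")  # gross revenue before splitting with owners
```

The sketch makes the sensitivity obvious: revenue scales linearly with each factor, so the modeling difficulty is almost entirely in guessing utilization and price per hour.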

3 thoughts on “Tesla Distributed Computing Revenue”

  1. The massive distributed computing power can be used to help the robots simulate alternative solutions and to assimilate their processed experiences (i.e., robot dreams).

  2. So – what’s the killer app for massive distributed inference?

    Self-driving vehicles need it, but they need it in real time, so that domain is out. Massive search and data processing can't be done because the nodes don't have data access or enough bandwidth.

    Massive, advanced, near-real-time simulations for bots and self-driving, perhaps?

    Data bandwidth seems like the bottleneck for whatever app you can imagine. The hardware is optimized for video processing, but that requires huge bandwidth to the internet unless the video feed comes from the car itself. What external bandwidth do these cars have? Can it be increased without swapping out too much hardware?

    Inference is cheap, so unless there is a real killer app, I don't see massive revenue.

    • 5G to cars can be fast within range of high-bandwidth towers; otherwise, federated access to terrestrial Wi-Fi. Protein folding and certain chemical simulations should be lower-bandwidth, so the materials science and biology implications are large.
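The bandwidth argument in this thread can be made concrete by comparing compute performed per byte of network transfer across workloads. The workload figures below are rough illustrative assumptions:

```python
# The commenters' point in numbers: workloads differ hugely in how much
# compute they perform per byte shipped over the network. Both workload
# profiles below are rough illustrative assumptions, not measurements.
def ops_per_byte(total_ops: float, bytes_transferred: float) -> float:
    """Compute operations performed per byte of network transfer."""
    return total_ops / bytes_transferred

# Video inference: ~50 KB compressed frame in, ~10 billion ops per frame (assumed)
video = ops_per_byte(10e9, 50e3)

# Molecular simulation: ~100 KB job description in, ~1e15 ops of compute (assumed)
sim = ops_per_byte(1e15, 100e3)

print(video, sim)  # the simulation does vastly more compute per byte shipped
```

This is why the reply singles out protein folding and chemical simulation: compute-heavy, input-light jobs are the ones a bandwidth-constrained car fleet could plausibly serve.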
