xAI Will Tap 500 ExaOPS of Compute From the Global Tesla Car Fleet

Tesla builds FSD hardware into every Tesla car, even for owners who do not pay to use FSD. This means owners can change their minds, sign up for FSD, and have it activated. It also means the 144 trillion operations per second (144 TOPS) of AI inference hardware in each car could be made available to run xAI's Grok. There are about 5 million Tesla cars globally. Elon Musk estimates that perhaps 60%, or 3 million, would be made available by owners who get paid to run AI computing. Based on the current vehicles, this would be about 500 ExaOPS (8-bit integer operations). This figure will likely double every two years as annual sales increase.
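The fleet-compute estimate above can be reproduced with simple arithmetic. The 144 TOPS per car, 5 million vehicles, 60% availability, and two-year doubling are the article's own figures; the code below is just a back-of-envelope check:

```python
# Back-of-envelope for the article's fleet-compute estimate.
# All inputs are the article's figures, not Tesla/xAI specifications.

TOPS_PER_CAR = 144e12      # 144 trillion INT8 ops/sec per FSD computer
FLEET = 5_000_000          # Tesla cars globally (approximate)
AVAILABLE = 0.60           # share assumed available, per Musk's estimate

cars = FLEET * AVAILABLE                 # ~3 million cars
exaops = cars * TOPS_PER_CAR / 1e18      # 1 ExaOPS = 1e18 ops/sec
print(f"{exaops:.0f} ExaOPS")            # ~432 ExaOPS, roughly the 500 quoted

# The article's growth assumption: fleet compute doubles every two years.
for years in (2, 4, 6):
    print(f"+{years} yr: ~{exaops * 2 ** (years / 2):,.0f} ExaOPS")
```

The raw product comes out near 432 ExaOPS, so the article's 500 figure is a round-up that presumably also counts newer vehicles with faster hardware.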

Grok is an AI modeled after the Hitchhiker’s Guide to the Galaxy, intended to answer almost anything and, far harder, even to suggest what questions to ask!

Grok is designed to answer questions with a bit of wit and has a rebellious streak, so please don’t use it if you hate humor!

A unique and fundamental advantage of Grok is that it has real-time knowledge of the world via the 𝕏 platform. It will also answer spicy questions that are rejected by most other AI systems.

Grok is still a very early beta product – the best we could do with 2 months of training – so expect it to improve rapidly with each passing week with your help.

X Premium Plus costs $16 per month. ChatGPT Plus costs $20 per month.

19 thoughts on “xAI Will Tap 500 ExaOPS of Compute From the Global Tesla Car Fleet”

  1. If Tesla will give me free charges at their public chargers then they can use my car for distributed computing inference. Win-win for me and Tesla.

  2. That’s rather silly. Those models are extremely data hungry. Typically they are not compute-bound but bandwidth-bound. How do you feed such a system?

  3. Starlink V2 adding a direct 4G-LTE connection to unmodified mobile devices means all existing Tesla vehicles could use Starlink for a connection of up to a few Mbps – which is enough for many functions, including some uses of this distributed AI inference compute.
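A rough calculation helps frame the bandwidth debate in the two comments above. The link speed and model size below are illustrative assumptions, not Tesla or xAI figures: shipping model weights over a few-Mbps link is slow, but once weights are resident in the car, per-token traffic is tiny.

```python
# Illustrative bandwidth check for distributed in-car inference.
# LINK_MBPS and MODEL_GB are assumptions for the sketch, not known specs.

LINK_MBPS = 3          # assumed Starlink direct-to-cell throughput per car
MODEL_GB = 40          # assumed size of a quantized LLM's weights

# One-time cost: downloading the weights over the link.
download_s = MODEL_GB * 8e9 / (LINK_MBPS * 1e6)
print(f"weight download: ~{download_s / 3600:.0f} hours")   # ~30 hours

# Steady-state cost: streaming prompt/response tokens is cheap.
BYTES_PER_TOKEN = 4    # rough average for English text tokens
tokens_per_s = LINK_MBPS * 1e6 / 8 / BYTES_PER_TOKEN
print(f"token throughput: ~{tokens_per_s:,.0f} tokens/s")
```

Under these assumptions the system would be bandwidth-bound for distributing or updating weights (hours per car), but not for serving inference once the model is already on board.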

Comments are closed.