Tesla Car Owners Will be Talking to Their Cars in Weeks

xAI has released Grok 3, the first model to break a 1400 score on the LM Arena leaderboard. This makes Grok 3 the top-ranked AI model.

Grok 3 is still undergoing reinforcement learning training to make it even smarter. This could help Grok 3 surpass OpenAI's o3.

The data center that xAI has built has been expanded to 250 megawatts of power and 200,000 GPUs.

xAI is already expanding to 1.2 gigawatts of power and 1 million GPUs. Those will be Nvidia B200 chips, along with Dojo 2 chips of similar compute power. A million of these more powerful GPUs would provide roughly 40 times the training compute of the current cluster.
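The rough arithmetic behind the 40x figure can be sketched as follows. Note that the ~8x per-chip speedup is an assumption used for illustration, not a figure stated in the article:

```python
# Rough scaling arithmetic behind the claimed ~40x training compute jump.
# The 8x per-GPU speedup (B200 vs. current-generation chips) is an
# assumption for illustration, not a figure from the article.
current_gpus = 200_000
planned_gpus = 1_000_000
per_gpu_speedup = 8  # assumed per-chip training speedup

count_scaling = planned_gpus / current_gpus      # 5x more GPUs
total_scaling = count_scaling * per_gpu_speedup  # 5 * 8 = ~40x compute

print(f"GPU count scaling: {count_scaling:.0f}x")
print(f"Total training compute scaling: {total_scaling:.0f}x")
```

Under these assumptions, a 5x increase in GPU count combined with an 8x per-chip speedup yields the 40x training power the article cites.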

The big economic impact for Tesla will be having a great conversational AI voice system out in about a week. It will be added to Tesla cars in 1-2 months. The voice system will also be in Teslabots.
Talking to xAI's Grok 3+ in Tesla cars will be a huge convenience and very useful for drivers.
It will also be deeply integrated with X and X payments.
Agent Grok will come out in a few months to perform very useful tasks and work based on voice commands.

AI expert Andrej Karpathy's summary of Grok 3: "As far as a quick vibe check over ~2 hours this morning, Grok 3 + Thinking feels somewhere around the state of the art territory of OpenAI's strongest models (o1-pro, $200/month), and slightly better than DeepSeek-R1 and Gemini 2.0 Flash Thinking. Which is quite incredible considering that the team started from scratch ~1 year ago, this timescale to state of the art territory is unprecedented. Do also keep in mind the caveats – the models are stochastic and may give slightly different answers each time, and it is very early, so we'll have to wait for a lot more evaluations over a period of the next few days/weeks. The early LM arena results look quite encouraging indeed. For now, big congrats to the xAI team, they clearly have huge velocity and momentum and I am excited to add Grok 3 to my 'LLM council' and hear what it thinks going forward."

1 thought on “Tesla Car Owners Will be Talking to Their Cars in Weeks”

  1. Brian, from where do you get that there will be Grok 3 in Teslas within weeks? Is HW4 even suitable for running LLMs?
