Tesla FSD and 4680 Batteries Huge Impact for 2022-2024

David Lee interviewed AI and self-driving expert James Douma again. James previously worked on the Autopilot system at Tesla.

David and James discussed information that Elon Musk provided during his interview with Lex Fridman.

Elon said FSD Beta 11 would be one stack to rule them all. Currently, Tesla FSD (Full Self-Driving) is on beta version 10.8, which is getting generally very strong reviews. Tesla FSD has had different neural net stacks for various aspects of driving: one neural net for highway driving, another for city streets, and another for parking lots. For at least six months, Tesla has wanted to merge all of them into one neural net for all of the driving. It would be better not to have handoffs between neural nets, which can cause errors and problems. However, Tesla could only make the transition to one stack if it would not cause a large drop in FSD reliability and performance. Apparently, FSD Beta version 11 will be when Tesla's single neural net outperforms the separate neural nets.

Elon also said FSD version 11 converts all the systems to work with surround video instead of vector space.

Before, they had a feed from each of the cameras.

Then they merged the images from all of the cameras.

Then they convert the merged video into surround video, which is in turn converted into a bird's-eye view.

These surround and bird's-eye views simplify the representation and make processing more efficient.

This is now done with neural networks and less with C code.

The outputs are no longer points; the neural net outputs objects (cars, people, other obstacles).
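The pipeline described above can be illustrated with a toy sketch: per-camera detections are projected into one shared top-down frame and deduplicated, so the output is a short list of objects rather than pixels. The camera names, offsets, and dedup radius here are illustrative assumptions; Tesla's real pipeline is a learned neural network, not hand-written rules like this.

```python
# Toy sketch: fuse per-camera detections into one surround, bird's-eye view.
from math import hypot

# Each camera reports objects in its own frame; offsets place the camera
# relative to the car's center (meters, x forward, y left). Hypothetical values.
CAMERA_OFFSETS = {"front": (1.5, 0.0), "left": (0.0, 0.9), "right": (0.0, -0.9)}

def to_birds_eye(camera, obj_x, obj_y):
    """Translate a camera-frame detection into the shared top-down frame."""
    cx, cy = CAMERA_OFFSETS[camera]
    return (obj_x + cx, obj_y + cy)

def fuse(detections, radius=0.5):
    """Merge detections from all cameras, dropping duplicates that two
    overlapping cameras both saw (within `radius` meters of each other)."""
    fused = []
    for cam, kind, x, y in detections:
        bx, by = to_birds_eye(cam, x, y)
        if not any(k == kind and hypot(bx - fx, by - fy) < radius
                   for k, fx, fy in fused):
            fused.append((kind, bx, by))
    return fused

dets = [("front", "car", 10.0, 0.0),   # car seen by the front camera
        ("left",  "car", 11.5, -0.9),  # the same car seen by the left camera
        ("right", "person", 3.0, 1.0)]
print(fuse(dets))  # one car and one person -- objects, not pixels
```

The point of the bird's-eye frame is visible in the example: two cameras reporting the same car in their own coordinates collapse to a single object once both are expressed relative to the car's center.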

Elon Musk has a far stronger technical understanding of AI than any other CEO.

Tesla FSD is also converting to use raw photon counts instead of processed images. This lets the AI go beyond the limitations of human vision to the full capabilities of the cameras. A camera can distinguish thousands of shades of gray, but people can only see about 32. This means that in an image that looks uniformly dark to humans, the AI can still interpret the gray levels and see cars and people. The cameras can likewise distinguish shades of white, so cars and people remain visible in what appears to be a whiteout image.
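A small sketch shows why raw sensor counts matter: a 12-bit sensor distinguishes 4096 levels, while a display-ready 8-bit image keeps only 256 and the eye resolves far fewer. The numbers below are illustrative; Tesla's actual photon-count pipeline is not public.

```python
# Toy sketch of why raw sensor counts beat display-ready images in the dark.

def to_8bit(count_12bit):
    """Tone-map a 12-bit sensor count (0..4095) down to a 0..255 display value."""
    return count_12bit * 255 // 4095

# Two objects in a very dark scene: a pedestrian at raw count 40, road at 24.
pedestrian, road = 40, 24
print(to_8bit(pedestrian), to_8bit(road))  # collapse to 2 and 1 -- nearly indistinguishable

# Working on raw counts, a network still sees a clear 16-count difference.
print(pedestrian - road)
```

After tone-mapping, the pedestrian and the road differ by a single display level; on raw counts the separation is sixteen times larger, which is the headroom the neural net can exploit.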

Elon said the FSD timeline is likely Level 4 by the end of 2022. This is probably Tesla looking at internal metrics, fitting a line through the graph of improving metrics, and seeing where the system would cross over known human capability.

Tesla Dojo progress? James expects that Tesla likely has cabinets of the Dojo system working, because they had a working tile at the demo. Worst case, they may not yet have the linkage system and networking running at full speed.

Tesla Bot expectations. James says Tesla will have to make motors and actuators suited for the Tesla Bot. He expects they will have the motors and actuators working well enough for an impressive demo at the end of the year. However, there will be a lot of work after that to make a useful helper bot.

Tesla 2022 – what does James look for? James feels the ramp of the 4680 batteries in 2022 will have the biggest impact for 2023. Tesla Energy will ramp explosively once batteries are no longer constrained. James also believes strong demand from utility firms will enable energy margins to grow. He thinks the battery ramp could let Tesla Energy grow to the size of Tesla's car business within 1 to 2 years once the 4680 batteries ramp.

SOURCES: Dave Lee Investing, Lex Fridman

25 thoughts on “Tesla FSD and 4680 Batteries Huge Impact for 2022-2024”

  1. Hm… has he? I know that during AI Day, using movies rather than single frames was touted as the solution to "flickering" objects. It was unclear, however, how much of that had been implemented and how much was in the "pipeline".

    And it would seem that cars are still "flickering" in and out of existence in the FSD visualization, particularly as they are occluded. So it would seem that FSD uses neither multi-frame input nor spatial memory. Yet.

    And then you have the statements Elon just made during his interview with Lex Fridman. I interpreted them as meaning the FSD software would start to use "movies" rather than single frames "soon".

  2. True, but that is not what I am after here. To me, it seems that the decisions are very bad and not improving. The perception, by contrast, is improving.

  3. That's how I interpreted Lex's and Elon's discussion. Plus, cars flicker in and out of existence when drivers use FSD. This flickering should be impossible with object permanence.

  4. Does anyone have an update on solid state batteries relative to the 4680s? Is solid state really happening, say in 2022, from Toyota, QuantumScape, etc? Why couldn't Dyson and that battery company they bought make it work?

    Also pouch batteries, like GM is using, seem interesting and maybe smarter than a bunch of tiny cylinders (even chunkier tiny cylinders like the 4680). I wonder if solid state and pouch batteries have any implications for making a structural battery pack, any advantages or disadvantages compared to cylinder lithium ion batteries.

  5. At this point, Tesla should deprioritize everything other than battery production, and perhaps the semiconductors their inverters/converters use (power electronics). Everything else would be easier to outsource, or they could buy pre-manufactured: body by Toyota/Honda, powertrain from Tesla, FSD open source.

    Sounds about right to me.

  6. Anyone who knows anything about horses knows that the introduction of even early, friction shoe braked, unwieldy, cart wheeled horseless carriages was an improvement in road safety.

    And now that I think about it, it is possible that the current image of cars being uniquely dangerous compared to huge, flighty, barely controlled animals, might just be a cultural hangover from when automobiles were the "new, advanced, untrustworthy technology" and so the news media were trumpeting every accident they caused.

  7. As I said, there've been a variety of attempts at quieter muscles.

    Some interesting work: https://pl.linkedin.com/company/clonerobotics

    https://wyss.harvard.edu/news/artificial-muscles-give-soft-robots-superpowers/

    I'm guessing that the main thing that has kept them mostly out of robots is imprecision and lack of repeatability. Factors like rate of wear are important as well.

    Tesla might take the approach of dynamically calibrating every muscle in a robot, so they know how much power input currently produces what degree of flexure. That'd give them decent precision, when combined with realtime hand-eye coordination.
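The calibration approach described above can be sketched as a measured power-to-flexure table that is inverted by interpolation to command a desired flexure. The function names and numbers are hypothetical; this is one plausible reading of "dynamically calibrating every muscle", not a known Tesla design.

```python
# Sketch: per-muscle calibration table, inverted to find the power needed
# for a target flexure. Re-measuring the table compensates for wear.
from bisect import bisect_left

def make_controller(calibration):
    """calibration: (power, measured_flexure_degrees) pairs, sorted by power."""
    flexes = [f for _, f in calibration]

    def power_for(target_flex):
        """Linearly interpolate the power needed for a target flexure."""
        i = bisect_left(flexes, target_flex)
        if i == 0:
            return calibration[0][0]
        if i == len(flexes):
            return calibration[-1][0]
        (p0, f0), (p1, f1) = calibration[i - 1], calibration[i]
        return p0 + (p1 - p0) * (target_flex - f0) / (f1 - f0)

    return power_for

# A calibration pass measured these points; a worn muscle would yield
# different numbers, so the table is re-measured periodically.
ctrl = make_controller([(0.0, 0.0), (0.5, 20.0), (1.0, 45.0)])
print(ctrl(10.0))   # 0.25 -- halfway to the 20-degree calibration point
print(ctrl(32.5))   # 0.75 -- halfway between the 20- and 45-degree points
```

Combined with real-time hand-eye feedback, even a coarse table like this can give usable precision, because vision closes the loop on whatever error the interpolation leaves.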

  8. Artificial muscles and soft robotics have been in heavy development for a long time.

    It's far more likely that Tesla will take some since-discarded approach and commercialize it.

    Scientists are always searching for better ways to do something and seldom go out of their way to build a commercial venture around a technology, especially if their current research has already inspired them to create something better.

  9. Part of the problem is also the eye of the media is constantly on automated driving incidents.

    Every single accident, big or small, gets at least 1000x higher priority in the media news cycle than one caused by humans. This puts high pressure on manufacturers to meet what may be impossible standards, even though a switch to everyone using what they already have would probably cut accidents down to levels unheard of since before the days of the horse-drawn carriage (which were death traps at least as much as cars).

  10. Any links to tweets/interviewers discussing this subject?

    Just interested about the subject of ML object tracking/path prediction which is often ignored in favor of object detection in talks.

  11. I'd like to see Tesla find a better approach than electric motors to actuate their robot.
    Something that doesn't whine or buzz as it moves. There's been a variety of attempts at quieter "muscles".

  12. Yes, public acceptance does not just need fewer crashes than humans make*, it also needs the system to never have crashes that a human thinks are insane.

    eg. A human can't keep track of hundreds of accounts, let alone thousands. If a computer tracked 10 thousand but forgot a few, most people would be OK with that. But if the computer sends out demands for payment of a debt of $0.00, then people laugh and call it a joke system.

    *And not just better than the worst humans, or even better than average. Most humans consider themselves way above average, so won't accept a system that isn't up to their own self-perceived standard.

  13. Tesla just recalled 350,000 cars for a defective latch that could lead to the front trunk opening while driving. I think that for a mix of good and bad reasons there is a lot of hype around Tesla. The company is surely innovative, but discussing FSD when a simple faulty latch can cause a serious accident is grotesque. The worst part, however, is that NBF covered recalls from other car manufacturers in the past, but since this one impacts Tesla's narrative it has been ignored.

  14. An issue here is that, even when the self-driving system has achieved a lower accident rate than humans, it's likely going to have those accidents under different circumstances than humans, leading to WTF reactions when humans look at the accidents.

  15. In fact, Musk has outright stated that their system does include object permanence, and extrapolations about what temporarily obscured objects will do.

    But they're expecting to improve that with the next release.

  16. In his interview with Lex Fridman, Elon Musk seemed unconcerned with the C part of the software that governs planning. But as a layman, it would seem to me that planning and decisions play a huge role in the crazy decisions the software sometimes makes.

    I just saw a video where the software decided to drive over a curb that it had detected. And many times the car will make neither right nor left turns even when crossing traffic would allow it. It waits for a "tap" on the gas by the user.

    Presumably, it's difficult to write a rule based system that can deal with all of these cases, but it should be easy with an ANN. Furthermore, Tesla has a massive amount of data, so they could easily train a new decision/planning network.

    Note that the input data into this network would be vastly smaller than what goes into the detection network. Instead of hundreds of millions of pixel values per second, it would be a *much* smaller set of vector space variables. So training should be fast and easy.
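The size comparison in the comment above can be made concrete with rough back-of-the-envelope numbers. The camera counts, resolution, and vector-state sizes below are illustrative assumptions, not Tesla specifications.

```python
# Rough sketch: perception-network input (pixels) vs. a hypothetical
# planning-network input (vector-space state). All figures are guesses.
cameras, width, height, fps = 8, 1280, 960, 36
pixels_per_second = cameras * width * height * fps

# Hypothetical vector-space state: ~50 tracked objects x ~10 attributes
# (position, velocity, class, ...) plus ~20 ego/route variables, at 36 Hz.
vector_state_per_second = (50 * 10 + 20) * fps

print(pixels_per_second)        # hundreds of millions of values per second
print(vector_state_per_second)  # tens of thousands of values per second
print(pixels_per_second // vector_state_per_second)  # roughly 20,000x smaller
```

Even with generous assumptions about the number of tracked objects, the planning input is four orders of magnitude smaller than the pixel stream, which is the basis for the commenter's claim that training such a network should be comparatively cheap.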

  17. If I understand things correctly, the FSD beta does not yet incorporate multiple time frames; it's still object detection frame by frame. And we know that using consistency over time gives a huge boost in perception reliability. So there is real "low-hanging" fruit to be picked right there in the near term. Same with memory of objects in space.
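One standard way multi-frame consistency suppresses flicker is hit/miss hysteresis: a track is confirmed after a few consecutive detections and only dropped after several consecutive misses. This is a generic tracking heuristic, not Tesla's actual method; the thresholds are illustrative.

```python
# Sketch: hit/miss hysteresis turns flickery per-frame detections into a
# stable track. A single missed frame no longer makes the object vanish.
def track(frame_detected, confirm=2, max_misses=2):
    """Return per-frame visibility given raw per-frame detection booleans."""
    visible, hits, misses, active = [], 0, 0, False
    for det in frame_detected:
        if det:
            hits, misses = hits + 1, 0
            if hits >= confirm:
                active = True   # confirmed after `confirm` consecutive hits
        else:
            misses, hits = misses + 1, 0
            if misses > max_misses:
                active = False  # dropped only after sustained misses
        visible.append(active)
    return visible

# The detector momentarily loses a car in frame 4.
raw = [True, True, True, False, True, True]
per_frame = raw                      # frame-by-frame: flickers off in frame 4
print(track(raw))  # [False, True, True, True, True, True] -- no flicker
```

The one-frame confirmation delay at the start is the price of stability; with spatial memory added on top, even longer occlusions could be bridged.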

  18. It is very difficult to judge the actual progress of the FSD beta software. All reviewers mix in subjective points such as "much smoother", but none – as far as I have seen – just give us a number: this time I had 4 interventions in 10 miles, last time it was 3 interventions in 6 miles.

    And then we have Galileo's crass statement that driving FSD beta is generally more work than utility, i.e. the software is so unreliable that it's just easier to drive yourself than sit on high alert in case the FSD software makes a dangerous maneuver.

  19. Actually, I give Musk credit where he deserves it, but first and foremost, I am a realist in the most brutal sense.

  20. Musk is going through deep cycles of high expectations and disappointments from Tesla's self-driving efforts…
