What Features are Needed for FSD as FSD 12.3.4 is Starting to Release

There are about six major capabilities that Tesla still needs to add to FSD, based on capability checklists. Most of them will also need refinement so that they are performed reliably every time.

The six capabilities are: recognizing hand signals, driving in reverse, avoiding potholes and low road obstacles, staying out of other drivers' blind spots, handling flashing red lights, and responding to emergency vehicle sirens.

Tesla FSD needs to recognize hand signals, and there is recent video evidence of it doing so some of the time.

Tesla is starting to release FSD 12.3.4. This release should refine the timid, overly cautious driving seen in 12.3.3.

13 thoughts on “What Features are Needed for FSD as FSD 12.3.4 is Starting to Release”

  1. I agree. Also, if you are on the interstate and FSD is set to 70 or whatever, and the speed limit drops, like on I-57 (I don't know why the speed drops there), FSD hits the brakes, and the cars behind you almost run into you because of the sudden speed change.

  2. One issue I have noticed here in Illinois is an inability to determine which speed-limit signs are applicable. On some tollways and expressways there are separate limits and signs for general vehicles, buses, and trucks, and currently FSD jumps to the latest one it sees, which can be manually overridden. More concerning, on Cicero Avenue (Route 50) it will jump from the 35 mph speed limit to 50 mph, which is not good if you are not supervising attentively. Note that route signs like IL 34 don't cause this, just those ending in 0 (and I expect 5). The signs are all the same shape, size, and coloring apart from the small print, so this might be a real challenge on the way to Level 5.

  3. FSD needs to be able to read signs. It can never get beyond basic Level 2 without that. It has to know how to deal with a “no right turn 8am-5pm” sign, or “no turn on red,” or “local traffic only,” etc.

  4. It still needs to remove the NHTSA-mandated stop at stoplights, recognize school-zone warning lights, and comply with stopped school bus requirements.

  5. Seems to be a data leak with the comment form. I have been seeing it prefilled with someone’s name and email address.

    FSD feels limited by hardware and is currently still far from Level 4 or Level 5.

    • That’s been going on for a couple of years now, off and on. I think the system is using cookies to identify you to pre-fill those fields, and somehow misreads them, thinking you’re somebody else. It’s a terrible security hole!

      Brian has always loved having bleeding edge features on his website, and the result is often blood all over the place, figuratively speaking… I don’t know how many times his comment system has crashed, taking years of painstakingly crafted technical comments with it.


      I think the toughest thing on that list is probably responding to hand signals, especially the informal sort where another driver declines to use their right of way and signals you to go ahead. That can be subtle enough that even humans have trouble with it.

      I’m kind of surprised to see driving in reverse listed as an unsolved problem. I’d expect something like that to be almost trivial. And a feature I’d absolutely want in FSD would be backing up with a trailer.

      • Actually, it may make sense that reverse is one of the last things for them to solve. They’re dependent on real-life input to train their models. If you think about it, you back up every day, but the distance is measured in feet, not miles.
        There’s more to training the model than just piling up examples; you need variations of actions and outcomes to learn from.
        Backing out of your driveway or parking spot is a short, repetitive journey that doesn’t supply much data for analysis.
        That’s my theory anyway, I’ve only dabbled in AI in other industries, no experience with Tesla FSD.

        • On the other hand, reverse almost always, outside of movies, happens at extremely low speeds, and the dynamics are extremely easy to model. So you’d think it would be low hanging fruit for any algorithmic approach, being very simple and low risk.

          Well, I suppose you’re right, given Tesla’s all neural net approach, you WOULD need actual training data. It’s only a trivial addition for a coded approach.
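          To illustrate the point that low-speed reverse dynamics are easy to model algorithmically, here is a minimal sketch using the standard kinematic bicycle model stepped backwards. This is a textbook approximation, not Tesla's actual planner, and the ~2.9 m wheelbase is an assumed, roughly Model 3-like value.

          ```python
          import math

          def step_reverse(x, y, heading, steer, speed, wheelbase=2.9, dt=0.1):
              """One Euler step of a kinematic bicycle model in reverse.

              x, y: rear-axle position (m); heading: yaw (rad);
              steer: front-wheel angle (rad); speed: reverse speed magnitude (m/s).
              The 2.9 m wheelbase is an assumption for illustration.
              """
              v = -speed  # reverse: velocity along the heading is negative
              x += v * math.cos(heading) * dt
              y += v * math.sin(heading) * dt
              heading += (v / wheelbase) * math.tan(steer) * dt
              return x, y, heading

          # Back straight out of a driveway for 5 seconds at 1 m/s:
          x, y, h = 0.0, 0.0, 0.0
          for _ in range(50):
              x, y, h = step_reverse(x, y, h, steer=0.0, speed=1.0)
          # x is now about -5.0 (5 m straight back), y and heading unchanged
          ```

          At parking-lot speeds there is essentially no tire slip, so this three-line geometric update is close to the whole dynamics problem; the hard part for an end-to-end neural approach is, as noted above, the scarcity of reverse-driving training data rather than the physics.
          
          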

Comments are closed.