Tesla Navigation on Autopilot for 99% of 800 Miles of Real-World Highway Driving

A pro-Tesla analyst reports that he used Tesla Navigation on Autopilot for a 13-hour, 800-mile drive from Seattle to the Bay Area. He found it very relaxing and a completely different experience from highway driving with the first version of Autopilot.

27 thoughts on “Tesla Navigation on Autopilot for 99% of 800 Miles of Real-World Highway Driving”

  1. If autopilot is or becomes as safe as or safer than human drivers on average, then it will be advisable to use it instead of driving yourself.

  2. hmm – previous reply vanished?
    No, this was a Tesla, March 2018. It hit a concrete divider.

  3. It’s getting better but not quite there yet. I’m only sounding sceptical because I need to curb my enthusiasm. True 100% auto-ubers will potentially usher in the end of ‘car-as-product’ and deliver ‘transport-as-a-service’. Unless you’re a tradesperson running your own business, you’ll never have to buy a car that sits depreciating in your driveway 95% of the time. Most city folk will just use their phone and book a regular robot-mini-van to the local train station — or even all the way to work — whichever is cheaper. It will get cars off the road, ease traffic problems, and even improve how we design our cities. Once you remove the expense of the taxi driver’s salary, the cost of hiring a car drops to maybe 10% of what it is today. It will change everything. But as the following video shows, we’re not there yet.
    https://electrek.co/2019/01/28/tesla-autopilot-snow-storm/

  4. Your question about the speed limit is interesting. On some roads Tesla will keep you within 5 MPH of the limit; on other roads the limit is decided by the driver. I think interstate highways are set at 5 over. I know 5 over isn’t enough to keep up with most traffic.

    Passing trucks on a divided highway is not an issue. Currently the driver has to acknowledge that they want to pass. Wiggle the wheel and the car completes the task. I’ve never tried to let the car pass on a two lane road, and I don’t think it would.

  5. This. You have two options against something: avoid or brake; you might do both, however.
    As an exception, you also have the more drastic option to drive the car off the road and into the ditch. It has saved me once or twice.
    This, however, is not so relevant on highways.

    And yes, you have two issues: one is not realizing a danger, the second is panicking as a plastic bag blows over the road.

  6. Yes. There are YouTube videos of Teslas driving with fairly heavy snow falling and normal lane markings hidden – search for “tesla autopilot snow”.

    I would guess that in snowy conditions the probability of autopilot failing and not realizing it is higher.

  7. I’m referring to the one in California in March 2018 where a Tesla hit a divider. Also, I think the autopilot lane-following got confused in that case, where the road split.

    The driver was not paying attention and didn’t return attention to the road quickly. But if I were the judge I would still find Tesla at least partially at fault. Their autopilot “knew” there was a problem and had sufficient information to at least minimize the accident by braking, but instead apparently chose to maintain highway speed for 5 seconds leading up to the crash. And that’s giving it the benefit of the doubt by assuming that traffic in adjacent lanes prevented it from shifting left or right to avoid the obstacle it had spotted.

    I hope Tesla has fixed that by now.

  8. I believe you’re talking about the case where the Uber car hit that pedestrian. As I recall, the car did have the capacity to stop built into it, and “knew” it needed to, but Uber management had that function locked out because the cars were braking on false positives too often.

  9. The current process is pretty difficult to overlook. The car just does not carry on on its own if it is worried or can’t handle the situation; it screams at you. It really is something one should experience; it will impress you.

  10. Tesla still calls this a beta feature, so let’s leave that there.
    However, I can testify as to driving long trips on highways in the desert Southwest. I don’t think a manual transmission or racecar suspension is going to make that trip better. I can view the scenery, try to find a radio station, or put eye drops in, while the car follows the car in front or pioneers through the deserted landscape. It’s pretty nice. I do alert myself in conditions the car doesn’t handle well.
    Not that big a deal.

  11. Yesterday, I drove from Bend, Oregon to Berkeley, and I had to go around dozens of trucks. Some were on four-lane highways, where I had to change lanes; others were on two-lane highways, where I had to cross over the middle line. I have driven from Seattle to the Bay Area numerous times, and 99.9% of it is on either freeways or four-lane roads. How did the Tesla pass trucks? Also, did the Tesla follow the flow of traffic or the speed limits, which are rarely obeyed on Highway 5?

  12. Have you tried it? I think the appeal is the reduced fatigue. I imagine it would be amazing in traffic, considerably less stressful and probably much safer.

  13. I also drive a manual, but I live close to work, so I don’t have daily freeway stop-and-go traffic to contend with. Manual shifting is a pain in stop-and-go traffic.

  14. I think a far more critical situation would be if the autopilot does not realize that it is about to do something wrong. In such a case, the driver would get NO warning – like when that Tesla drove under a semi-trailer because it apparently couldn’t see it pulling in front of the car.

    However, there was that other accident where the autopilot apparently recognized that a crash was going to happen 5 seconds before it did, and signaled the driver to do something – but they did not.

    In that situation the autopilot should have immediately started slowing and flashing its emergency lights, rather than simply continuing to drive on at full speed and waiting for the driver to act. That would likely have gotten the driver’s attention a lot quicker than any warning sound, and would have reduced the seriousness of the accident if not avoided it.
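
    A minimal sketch of the fallback behavior this comment argues for (the Car stub, the method names, and the 2-second grace period are all invented for illustration; none of this is Tesla’s actual software):

    ```python
    # Hypothetical sketch only: the Car stub, method names, and the
    # 2-second grace period are invented for illustration; this is
    # not Tesla's actual control software.

    class Car:
        """Minimal stub of a vehicle interface for the sketch."""
        def warn_driver(self) -> None:
            print("ALERT: take over now!")
        def driver_has_taken_over(self) -> bool:
            return False  # stub: pretend the driver stays unresponsive
        def flash_hazard_lights(self) -> None:
            print("hazard lights on")
        def decelerate_smoothly(self) -> None:
            print("shedding speed")

    HANDOFF_GRACE_S = 2.0  # assumed window the driver gets before the car acts

    def on_unhandled_hazard(car: Car, seconds_since_warning: float) -> None:
        """Run each control cycle after the system flags a hazard it is
        not confident it can handle."""
        car.warn_driver()
        if seconds_since_warning < HANDOFF_GRACE_S:
            return  # give the driver a brief chance to respond first
        if not car.driver_has_taken_over():
            # Driver unresponsive: degrade gracefully instead of
            # holding full highway speed until impact.
            car.flash_hazard_lights()
            car.decelerate_smoothly()
    ```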

  15. Brush-fire smoke alongside the road is another issue (in late summer 2017 on I-95 South, a white lidar test vehicle was stranded in the Carolinas). Jackknifed semis, along with semi brake parts left on the road, are other issues to be worked out. Also, do the GPU/CPU chips used have any export restrictions?

  16. Ha ha, reminds me of “Throw Momma from the Train,” where he’s writing and has a mental blank about choosing “The night was hot” or “The night was moist.” The night turned out to be sultry, and the rain/snow thing turns out to be sleet. 😉

  17. Tesla is still 6 months from the target for Feature Complete FSD, so that’s pretty decent for regular-release AP. Once Feature Complete FSD is out in beta for the Tesla Network, everybody will probably get a chance to see exactly what the gaps are as it tries to act as a robotaxi under the owner’s supervision. That will be the intense final stretch of fleet learning before genuine FSD.

  18. That would be worse than driving yourself, because suddenly having control handed to you will be much more dangerous than being in control all the time.

    Honestly, if you can’t sleep or watch videos, then what is the attraction?

    Though, I’m someone who likes to drive a manual gearbox, even in the city, so I’m clearly not in the majority.

  19. 99% does not cut it; student drivers are often better than that. You really need that 100%, or a way to safely stop when some very low-probability thing happens and the computer has an issue. And no kind of road or road feature should present any sort of issue. If there is an issue, the only kinds that can be forgiven are enforced detours, road work, and acts of God. And these things should be reported as soon as the irregularities present themselves, so other cars can be told what to expect or what to avoid.

    In Los Angeles, there are major traffic accidents every day on the freeways. Following the directions of police, staying behind a zigzagging police car, recognizing road flares, and anticipating people on foot crossing lanes in the aftermath of accidents are all critical. Less critical is knowing what to do when a road is flooded and you are directed to go around. There the car can probably just find somewhere to park, and maybe someone at the company can take over remotely… or give the car the appropriate commands.

  20. Agreed. The issue of how much is good enough is a complex and vexing one. Are they talking about situations where the AP lost control, did not know with enough confidence whether it was doing the correct thing, or actually did the incorrect thing? It is not clear to me.
    But then, when you drive, the same things most certainly happen to you as well: there are seconds when you are confused, draw a wrong judgment or conclusion from your senses, or make an incorrect decision, and then moments later you recover. You don’t even record or remember the event unless it was a scary close shave, while AP does record.

  21. A headline like this without an explanation of the 1% is pretty worthless.

    Maybe the 1% was parking lots.

    Maybe the 1% was emergency maneuvering to avoid being killed by AP.

  22. So maybe 16 times there was a 30-second stretch where the nav system was going to crash at high speed. Since a bad decision takes only a few seconds, I suppose it could be as many as a hundred times? Very relaxing.
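
    For what it’s worth, the arithmetic behind that guess roughly checks out, assuming the 13-hour trip length from the article:

    ```python
    # Back-of-the-envelope check of the "maybe 16" and "as many as a
    # hundred" figures, using the 13-hour trip from the article.
    trip_s = 13 * 3600        # total drive time in seconds
    manual_s = 0.01 * trip_s  # the 1% not on Autopilot: 468 seconds
    print(manual_s / 30)      # ~15.6 thirty-second stretches ("maybe 16")
    print(manual_s / 5)       # ~93.6 five-second lapses ("as many as a hundred")
    ```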

  23. A drunk can successfully handle 99% of highway driving. The problem is knowing in advance when that 1% is going to happen, far enough in advance to wake the driver and bring them up to situational awareness.

    Because a self-driving car isn’t worth squat if you have to keep your hands on the steering wheel and pay attention all the time anyway. You might as well be driving yourself.
