Self-driving cars must be completely self-driving, because people can't be trusted to back them up

Tesla CEO Elon Musk (indirectly) made a striking admission yesterday: the autonomous driving features his company recently launched are too dangerous. On an earnings call with investors, Musk said that “additional constraints” will be added in response to evidence that people have pushed the feature too far. “There’s been some fairly crazy videos on YouTube,” he said. “This is not good.”

It has been well documented that people have both intentionally and accidentally tested the limits of Tesla’s new feature (see “Drivers Push Tesla’s Autopilot Beyond Its Abilities”). But while some individuals have clearly been reckless, Tesla bears some responsibility as well due to the way it has designed and deployed its system, as Musk seems to realize.

Musk didn’t mention any specific “constraints” that will be added to make the autonomous driving feature safer. One obvious upgrade would be to require that someone be sitting in the driver’s seat. As this video of a Tesla driving itself with no one at the wheel on a private road shows, the system requires only that the driver’s side seatbelt be clicked in, even though the driver’s seat has an “occupancy sensor.”

Restrictions like that may not be enough, though, if Google is right about the relationships that form between humans and autonomous cars. One reason Google invented a new car design that lacks a steering wheel was that long-term tests of conventional SUVs modified to drive themselves showed that people quickly became dangerously detached from what was going on around them. When the car needed them to take over because it couldn't handle a particular situation, they weren't ready.