Will Self-Driving Vehicles Make Uber A Dangerous Choice?

Ride-sharing companies Uber and Lyft offer a convenient alternative to calling a cab. Once users download the company’s app and register an account, they can start booking rides. Although ride-sharing services can be more expensive than a cab, their convenience keeps riders coming back.

In an attempt to cut costs and increase profits, both Uber and Lyft have added self-driving vehicles to their ride-sharing fleets. Large corporations such as Google and Ford also plan to launch self-driving taxi services at half the cost of Uber.

Although Google’s autonomous vehicle spin-off, Waymo, seems to be ahead of the curve, its cars aren’t perfect. One of Waymo’s cars crashed after the safety driver fell asleep at the wheel and disabled the self-driving software by touching the gas pedal. If it’s that easy to make a potentially fatal mistake, there is real cause for concern.

Another concern is the rate at which driverless cars are being sideswiped and rear-ended. Wired.com published a chart detailing 49 self-driving crashes in 2018.

Once the kinks are worked out, autonomous cars are projected to reduce the roughly 40,000 fatalities that occur on U.S. roads each year. Until then, there have been, and will continue to be, accidents and, unfortunately, fatalities.

Safety drivers don’t make self-driving cars safe

Autonomous driving technology has a long way to go before cars can be let loose without safety drivers. However, the presence of a safety driver doesn’t make an autonomous vehicle safe. So-called safety drivers often get too comfortable and let their guard down, losing the ability to respond quickly. For instance, in March 2018, a self-driving Uber vehicle in Arizona, with a safety driver present, killed a pedestrian walking a bicycle across the street. The safety driver was streaming a TV show on her phone instead of watching the road.

Uber’s safety driver swerved too late to make a difference and failed to brake until after impact. Reports show the car’s sensing technology had identified the pedestrian six seconds before impact, long enough for the car’s automatic braking system to engage as it approached her. However, Uber had admittedly disabled the automatic braking feature to prevent a jerky ride. Even so, prosecutors found Uber not to be criminally liable, and the company suspended its testing of autonomous vehicles.

There are no laws on the books that govern self-driving technology

The incident in Arizona is believed to be the first time a pedestrian has been killed by a self-driving car. Neither state nor federal law specifically governs self-driving technology, so Uber was let off the hook. The safety driver’s toxicology report, however, came back positive for marijuana and methamphetamine. AZ Central reported that officials found the safety driver to be “inattentive” and that her “disregard for assigned job function to intervene in a hazardous situation” contributed to the crash. The outcome of the case has yet to be published, but even if the safety driver bears some liability for the incident, that alone is not enough.

Ride sharing is a largely unregulated industry with a large number of safety concerns. Passengers and drivers have been assaulted, kidnapped, and even murdered. Adding self-driving cars into the mix just adds further risk.

It’s time for new liability laws across the nation for ride-sharing

Some cities, including Chicago, are aggressively pursuing regulation of ride-sharing because of the risks posed to passengers. For example, many personal auto insurance policies don’t provide coverage when a vehicle is used for commercial purposes, yet Uber and Lyft don’t provide insurance to their drivers. This leaves injured passengers potentially unable to recover compensation. Imagine if the incident in Arizona had resulted in injury rather than death. Who would be held liable for the injured party’s bills: Uber, or the safety driver?

A person already puts their life at risk when getting into a ride-sharing car. When that rideshare is an autonomous vehicle, with or without a safety driver, the risk increases.

Self-driving taxis are the future, like it or not

Currently, a person’s chance of getting an autonomous vehicle as a ride share is slim. However, once self-driving cars go mainstream, and autonomous taxi fleets take to the streets, it will be impossible to know if the taxi called will be driverless. When it shows up, you’ll have to take it or leave it. Technology goes where it goes, and eventually everyone will have to adopt it or be left behind.

12 thoughts on “Will Self-Driving Vehicles Make Uber A Dangerous Choice?”

  1. If strict obedience to the law produces different driving behaviour than a good human driver would: maybe change the laws to make them correlate with reality?

  2. In the Autonomy Day video, Musk did not fully address the case of a passenger in an autonomous Tesla that still has a steering wheel grabbing the wheel and causing an accident (around 3:20:00). I felt the question was never asked exactly right, but it seemed like a gap was left that Musk had not addressed. Unless the steering is drive-by-wire (?), or the power assist could be used to override such movements, the best that could be done might be to have the computer fight for control and, if it failed, pull the car over. I suppose some liability might point to the passenger (self-destructive/fearful/paranoid, whatever). Any insights into this scenario?

  3. “once self-driving cars go mainstream, and autonomous taxi fleets take to the streets, it will be impossible to know if the taxi called will be driverless. When it shows up, you’ll have to take it or leave it.”

    That doesn’t follow at all – very likely it will be an option at the time you call for a cab. You’ll probably end up paying quite a bit more for a human driver, if nothing else, so even those who don’t mind SD taxis will insist on knowing.

  4. The article linked to, with the chart of 49 collisions, fails to indicate how many miles the SD cars had driven, or whether the rate per mile was actually all that bad compared to human-driven cars. I’m guessing it actually is worse, but the article fails to make that case.

  5. Reading through a bunch of the California self-driving accident reports, the rear-endings generally happened when the SD car was stopped for some reasonable reason. But it sounds like the car had stopped, then crept forward as if it were going to proceed, and then stopped again.

    Possibly they are observing the law on where to stop before turning, whereas human drivers expect anyone planning to turn right (for example) to edge forward past the initial stop, to where they can see whether it is safe to proceed, without stopping again.

    While this needs to be ‘fixed’, the article makes it sound like this is evidence that the SD cars are endangering humans, when in fact the highest collision speed was about 5 mph, with 2 mph more typical, and typically minor damage and no injuries. I think I read one case with a speed a bit over 30 mph.

    More drastic accidents did occur, but many or most of those seemed to happen when a human driver did something dangerous nearby and the safety driver took over but did something nearly as dangerous. That doesn’t speak well to the idea that safety drivers can be expected to take over when something goes wrong.

  6. The article seems to hold the position that, when comparing human-driven ride sharing with fully self-driven ride sharing (with no human safety driver), the risks to the passenger only increase (see this sentence: “Adding self-driving cars into the mix just adds further risk.”).

    That seems not to take into account that the risks of the various forms of the human driver attacking the passenger are completely gone when there is no human driver. It easily could be that the total amount of risk to the passenger is higher in the fully self-driven ride-sharing case, but the article doesn’t demonstrate that.

    I make no claim about which case currently presents the higher risk to the passenger. I’m only saying that ignoring that the risks the human driver poses to the passenger are eliminated in the fully self-driven case is poor analysis.

    In the long run, when self-driven cars overcome their current limitations and greatly out-perform human drivers, it seems clear to me that fully self-driven ride sharing will present lower total risk to the passenger. I have no idea how long it will take for self-driving technology to reach that level.

  7. The fact that you wrote an article shitting on self-driving literally days after Tesla laid out in detail how they will have it solved by the end of next year is strange, to say the least.

  8. TL;DR – Self-driving cars should be limited to California for the years required to make them safe before they are allowed near any innocent victims.
    Fortunately the USA has a whole state that doesn’t contain any people that anyone else would miss.
