Tricking Waymo, Cruise and Tesla Driving Systems

YouTuber Mark Rober made a video sponsored by Luminar (a lidar company valued at roughly $170-200 million) that used a large, billboard-sized picture of the road ahead to supposedly cause Tesla Autopilot to fail. Reviewers online have pointed out that Tesla FSD (Full Self-Driving) was never engaged and was never claimed to be in use. The system shown was Autopilot, and even Autopilot was often not engaged. There are also questions about the timing of when the Tesla systems were engaged or disengaged.

Mark Rober never says he was testing Tesla Full Self-Driving. FSD is implied by the video title, but it is never claimed or shown in the video. Rober was using Autopilot, and Autopilot was often not engaged.

Here are some of the mistakes reviewers attribute to Mark Rober.
Mistake #1: During the fog test, an alert is visible on the screen at the time of impact, suggesting the accelerator was being pressed.

Mistake #2: During the water test, he is clearly driving in the middle of the road, where Autopilot will not engage.

Mistake #3: During the fake-wall test (a scenario with essentially zero chance of being encountered in reality), the editor left in a few frames showing the display at the point of impact. Again, Autopilot was not engaged.

There have been many instances where protestors disabled the leading lidar-based self-driving robotaxis (Waymo, Cruise) simply by placing $20 traffic cones onto the vehicles.

One YouTuber who has tested both Waymo and Apollo Go found Apollo Go to be vastly inferior to Waymo.

Tesla FSD customers and influencers in China are now using Tesla FSD and posting videos of the Tesla system handling difficult traffic and weather. Complex intersections and a high density of pedestrians, bikers, and scooters can overwhelm lidar-reliant driving systems, and those conditions are common in China.

Robotaxi systems (Waymo, Cruise, and Chinese operators like Apollo Go) can only operate in a few hyper-mapped, geofenced areas. Tesla FSD operates, with human drivers supervising (but rarely touching the steering wheel or pedals), all over the USA, Mexico, Canada, and China.

There are YouTube videos of Tesla FSD in China handling far more challenging situations.

The following situations commonly cause failures in LIDAR-based robotaxi or driver-assist systems:

Complex Intersections
Robotaxis can falter at intersections where multiple lanes merge or diverge, especially when human drivers act unpredictably (e.g., sudden lane changes or not signaling). The system may hesitate or make incorrect decisions, disrupting traffic flow.

Construction Zones
Road layout changes due to construction confuse LIDAR systems, which rely on pre-mapped data. Temporary detours or lane closures can lead to hesitation or the need for human intervention, as the robotaxi struggles to adapt to unmapped conditions.
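To make the construction-zone failure mode concrete, here is a minimal, purely illustrative sketch (not any vendor's actual code) of the kind of map-versus-perception consistency check an HD-map-reliant stack might run: when the live-perceived lane line diverges too far from the pre-mapped one, the system requests a human takeover. The function name, point format, and threshold are all assumptions for illustration.

```python
# Hedged sketch: comparing pre-mapped lane geometry against live perception.
# All names and numbers here are illustrative assumptions.

def max_lane_divergence_m(mapped_pts, observed_pts):
    """Largest lateral gap (metres) between mapped and observed lane points,
    compared index-by-index (points assumed pre-aligned along the road)."""
    return max(abs(m - o) for m, o in zip(mapped_pts, observed_pts))

mapped   = [0.0, 0.0, 0.0, 0.0]   # mapped lane offset from centreline, metres
observed = [0.0, 0.1, 1.8, 2.0]   # cones shift the lane through a work zone

DIVERGENCE_LIMIT_M = 0.5           # assumed tolerance before fallback
if max_lane_divergence_m(mapped, observed) > DIVERGENCE_LIMIT_M:
    print("map/perception mismatch: request driver takeover")
```

The point of the sketch is that a system anchored to a pre-built map has no good answer once reality stops matching the map; its safe option is hesitation or handing control back.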

Adverse Weather Conditions
Heavy rain, snow, or fog impairs LIDAR sensors by scattering laser beams, reducing their ability to detect obstacles accurately. This can result in the system disengaging or misjudging distances, posing safety risks.
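The weather effect above has a simple physical basis: fog attenuates a laser pulse roughly per the Beer-Lambert law, and a lidar return makes a two-way trip. The sketch below illustrates this with assumed extinction coefficients (the specific numbers are illustrative, not sensor specifications).

```python
import math

# Hedged sketch: Beer-Lambert attenuation of a lidar return.
# Extinction coefficients below are illustrative assumptions.

def lidar_return_fraction(range_m: float, extinction_per_m: float) -> float:
    """Fraction of emitted laser power surviving the two-way trip
    through a scattering medium (Beer-Lambert law)."""
    return math.exp(-2.0 * extinction_per_m * range_m)

clear_air = 0.0002   # assumed extinction coefficient, 1/m
dense_fog = 0.02     # assumed extinction coefficient, 1/m

for label, k in [("clear air", clear_air), ("dense fog", dense_fog)]:
    frac = lidar_return_fraction(100.0, k)
    print(f"{label}: {frac:.4f} of signal returns from 100 m")
```

Even modest fog raises the extinction coefficient by orders of magnitude, so the returned signal from a 100 m target can drop from near-full strength to a few percent, which is why heavy weather pushes lidar stacks toward disengagement.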

High Pedestrian Density
Areas crowded with pedestrians, bikers, and scooters overwhelm the system’s ability to track and predict movements. Hesitation or incorrect path predictions can occur, especially in urban settings where erratic behavior like jaywalking is common.

Unmarked or Poorly Marked Roads
Roads lacking clear lane markings or signage challenge LIDAR systems, which depend on these cues for navigation. Without them, the robotaxi may disengage or struggle to maintain proper lane discipline.

Emergency Vehicles
The unpredictable, high-speed behavior of emergency vehicles (e.g., ambulances or fire trucks) can confuse robotaxis, leading to hesitation or failure to yield appropriately, potentially delaying critical responses.

Traffic Situations Causing LIDAR Driver Assist Failures in China
In China, LIDAR-based driver assistance systems (used in semi-autonomous vehicles) face similar challenges, intensified by the country’s unique traffic conditions. Common failure scenarios include:
Crowded Streets with Mixed Traffic
Chinese cities often feature high volumes of pedestrians, cyclists, and two-wheelers (e.g., e-bikes and scooters) alongside cars. This mixed traffic can overload the system’s tracking capabilities, leading to disengagement or errors.

Unpredictable Traffic Behavior
Sudden lane changes, jaywalking, or vehicles ignoring traffic signals are prevalent in some areas. These behaviors make it difficult for the system to anticipate and react, often requiring human intervention.

Poor Road Conditions
Potholes, uneven surfaces, or debris can affect LIDAR sensor accuracy, particularly in detecting road boundaries. This can cause the system to misinterpret the driving environment and fail.

Complex Urban Environments
Narrow streets, dense pedestrian crossings, and informal road usage (e.g., street vendors or parked vehicles blocking lanes) confuse the system, leading to hesitation or disengagement.

Here are many videos showing Tesla FSD handling challenging situations and the brittleness of the LIDAR-based driving systems.