There is a lot of controversy about the Tesla safety monitor in the Tesla Austin robotaxi rollout. The safety monitor has access to two buttons ("pull over" and "stop in lane"), which are two forms of emergency stop.


On a train there is an emergency stop button or lever. Everyone on the train can hit or pull the emergency stop. Do we call everyone on the train a supervisor? No, they are all still passengers. There can be extra train employees on board beyond the conductor, and those other employees also have access to the emergency stop. Everyone also has call buttons on a support intercom to the conductor. This applies to regular trains and public transit.
Bus passengers can hit buttons to request a pullover.

The terminology is still "passengers" in all bus and train cases, whether they are employees or not. No one calls them supervisors or drivers because they have access to an emergency stop.
These terms and capabilities have existed for decades and are common throughout the world. Just because this is a robotaxi and Tesla, people are getting confused and biased.

Also, look at the picture of the safety monitor in the passenger seat. They would have to climb over the divider, which is 8 inches above their seat. A larger adult would have great difficulty climbing over it in a timely way. In most cases, you would have to unbuckle, get your feet onto the seat, and then step over into the driver's seat. Someone in the back seats would need to dive head first into the passenger seat, or again get their feet onto the rear seat and step over to the driver's seat.
Economics of the Safety Monitor
Waymo and all other robotaxi companies (a dozen in China and Asia, and a dozen in North America) have had $1-3 billion per year development programs. Revenue from robotaxi has been at most $30-50 million per year for Waymo: about 8 million paid miles in 2024 and maybe 18 million miles in 2025.
Tesla has also been spending a billion per year. But Tesla has made a billion dollars by selling Autopilot (1 million) and FSD (500K).
The safety monitor will only be in the cars for 1-4 weeks. It is Tesla being extra cautious. Tesla already has 300+ test drivers and staff working vehicles in the field; they did not hire people specifically to be safety monitors.

It is like when Amazon launched Amazon Go, its cashierless stores: they had 12 extra people at each 7-Eleven-like store to check the equipment and software. That is people checking customer service and technical issues for a specific rollout. For Tesla, it is a question of where to deploy 300 people already on staff who were already testing the FSD software. That spending and those staff already existed and have existed for years for FSD and Autopilot.

I would say the FSD/robotaxi development and testing staff will always be there, except maybe five years out, after Teslabot is at scale and perfected. But then the $1-2 billion per year in development and testing staff will be meaningless, because there will be $500+ billion per year in revenue from 10 million cars and trucks operating as robotaxis and robotrucks.
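The revenue projection above implies a per-vehicle figure that is easy to check. A minimal sketch, using the article's own projected numbers (the fleet size, revenue total, and development-spend midpoint are the article's estimates, not reported data):

```python
# Sanity check of the article's projection: $500+ billion per year in revenue
# from 10 million robotaxis and robotrucks. All inputs are the article's
# estimates, not reported financial data.

projected_revenue = 500e9      # dollars per year (article's projection)
fleet_size = 10_000_000        # robotaxis + robotrucks (article's projection)
dev_spend = 1.5e9              # midpoint of the $1-2B/year development cost

revenue_per_vehicle = projected_revenue / fleet_size
dev_share_of_revenue = dev_spend / projected_revenue

print(revenue_per_vehicle)     # 50000.0 -> $50k of revenue per vehicle per year
print(dev_share_of_revenue)    # dev spend would be ~0.3% of projected revenue
```

At that scale, the point is that development and testing headcount becomes a rounding error relative to revenue, which is the argument the paragraph is making.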

Brian Wang is a Futurist Thought Leader and a popular science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked the #1 Science News Blog. It covers many disruptive technologies and trends, including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting-edge technologies, he is currently a co-founder of a startup and a fundraiser for high-potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.
You might want to read Dan O’Dowd’s review on X: https://x.com/RealDanODowd/status/1937325824321216875
It’s pretty damning and too much to summarize here. But:
– Reuters reported that Tesla deployed 10 Supervised Robotaxis in Austin yesterday. The 10 Supervised Robotaxis committed two safety-critical errors in half a day of driving. That extrapolates to four safety-critical errors per full day.
– A post from Ashok Elluswamy, Tesla's VP of AI, showed that Tesla's Supervised Robotaxis had driven about 500 miles. In those 500 miles, the Supervised Robotaxis made two known safety-critical errors: driving on the wrong side of the road and phantom braking. That works out to only approximately 250 miles between safety-critical errors.
– Tesla has spent months preparing for this. Despite Elon Musk saying in 2019 that “If you need a geofence you do not have real self-driving”, the Supervised Robotaxis in Austin are geofenced. Despite Elon saying in 2019 that HD maps were a “crutch” that “should not be used”, Tesla has mapped the geofence. Elon even admits that the Supervised Robotaxis are avoiding difficult intersections in Austin. An image from Tesla’s control room showed Tesla has steering wheels to control the Supervised Robotaxis remotely.
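The error-rate arithmetic in the comment above can be sketched in a few lines. The mileage and error counts are the commenter's figures (attributed to Reuters and Elluswamy's post), not independently verified data, and the per-day figure is a naive linear extrapolation:

```python
# Back-of-envelope check of the commenter's figures (assumed, not verified):
# 10 vehicles, ~500 total fleet miles, 2 known safety-critical errors
# observed in half a day of driving.

miles_driven = 500        # commenter's estimate of total fleet miles
critical_errors = 2       # wrong-side-of-road and phantom-braking incidents
half_day_errors = 2       # errors observed in half a day

miles_between_errors = miles_driven / critical_errors
errors_per_full_day = half_day_errors * 2  # naive linear extrapolation

print(miles_between_errors)  # 250.0 miles between safety-critical errors
print(errors_per_full_day)   # 4 errors per day, if the half-day rate held
```

Whether either incident actually qualifies as "safety critical" is disputed in the replies below; the arithmetic only shows how the commenter got from the raw counts to the 250-mile and four-per-day figures.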
Not ready for prime time. I wouldn’t get into one and would be hesitant to walk, bike, or drive near one too.
Dan O’Dowd has his own software company selling software that competes with Tesla’s. Asking Dan about Tesla is like asking Sam Altman of OpenAI to rate xAI’s Grok.
A Waymo just blocked traffic going the other direction and in an intersection. This just happened:
https://x.com/nextbigfuture/status/1937557962727588136
The Tesla was considering turning earlier than it was supposed to, but it kept moving and corrected after traveling about 2 feet in the wrong part of the lane. No more than 20% of the vehicle crossed into the wrong lane, and it did not make any other vehicle swerve or stop. There was no safety-critical error, because "safety critical" means that if the car had continued without intervention, a critical accident would have occurred. The Tesla software corrected itself in two 0.5-1 second actions. The Tesla had two errors that were not safety critical, and they were not even interventions.
Waymo had a wrong-way error where the vehicle was completely in the wrong lane, and the Waymo was at times stopped while the other traffic was blocked.