Tesla’s Autopilot has never quite been the revolution that users were promised, and the Full Self-Driving (FSD) mode, an extension of Autopilot, was controversial long before it rolled out en masse in the company’s cars.
Now, to make matters worse for the leading manufacturer of electric vehicles, a Tesla engineer has testified that a 2016 demo video of the car’s self-driving and automatic parking features was a work of fiction. In plain terms, it was staged, and even more heavily than Steve Jobs’ famous iPhone presentation.
According to Reuters, Ashok Elluswamy, Tesla’s director of Autopilot software, admitted in testimony that the video was staged. The admission appeared to confirm a 2021 New York Times report in which anonymous Tesla workers said that the route followed by the car in the video had been pre-programmed, and that the vehicle had crashed during filming.
Elluswamy said the demo was created after Tesla CEO Elon Musk asked the Autopilot team to design a “demonstration of the system’s capabilities.” He acknowledged, however, that the video did not accurately represent what Tesla’s self-driving technology could do at the time.
A safety problem that has ended in several deaths
Driver-assistance software of this kind is fairly standard, but in recent years Tesla has been repeatedly singled out in connection with several fatal accidents involving the feature, and we have covered the issue here on several occasions.
The feature is so controversial that California lawmakers enacted a law preventing Tesla from advertising its cars as fully self-driving until the vehicles are actually capable of autonomous operation.
In Europe, legislation on autonomous driving is very strict, so much so that Tesla has had trouble getting its technology approved at the same level as in the United States. In Paris, a Tesla operating as a taxi was involved in a fatal runaway incident, and since then all eyes have been on Elon Musk’s company.
Autopilot, which was supposed to revolutionize how we relate to our cars and to mobility, has ended up being one more tool that must be watched closely if we do not want to end up involved in an accident. There is still a long way to go, and that is weighing on Tesla.
Level 2 autonomous driving, but with profound differences
Autopilot is somewhat of an outlier compared to the Level 2 autonomous systems of Tesla’s rivals. All Tesla variants rely solely on image-based data processing, fed by vision cameras located around the car.
This is in contrast to systems deployed by General Motors or Ford, for example, which also incorporate radar into the sensor suite so the vehicle can sense its proximity to the world around it. Tesla instead uses a pair of front-facing cameras and clever image processing to triangulate objects and determine their relative distance.
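Tesla’s actual vision stack is proprietary and far more sophisticated, but the basic geometry behind estimating distance from a pair of forward cameras is classic stereo triangulation: depth is inversely proportional to the disparity, the shift in where the same object appears in each camera’s image. Here is a minimal sketch in Python; the focal length, baseline, and disparity values are hypothetical, chosen only to make the arithmetic concrete.

```python
# Minimal sketch of stereo triangulation: estimating the distance to an
# object seen by two forward-facing cameras. Illustrative only; this is
# not Tesla's pipeline, and all numbers below are hypothetical.

def stereo_depth(focal_length_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Depth in meters from the classic pinhole stereo relation Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("Object must appear shifted between the two images")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical values: 1000 px focal length, cameras mounted 30 cm apart,
# and an object whose image shifts 12 px between the left and right views.
distance = stereo_depth(focal_length_px=1000.0, baseline_m=0.30,
                        disparity_px=12.0)
print(f"Estimated distance: {distance:.1f} m")  # -> 25.0 m
```

The same relation also shows the weakness of a camera-only approach: at long range the disparity shrinks toward zero, so small pixel errors translate into large depth errors, which is precisely the gap radar is meant to cover.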
Tesla’s Autopilot system has also been criticized by regulators for allegedly failing to monitor human driver behavior adequately. Level 2 semi-autonomous systems require full-time supervision of the driving task by a human, even while the system controls the vehicle’s steering and speed.
If the system makes a mistake, the person must be able to take control at any moment. Many Level 2 configurations therefore include a monitoring system to ensure the human driver is paying attention, typically via capacitive touch sensors on the steering wheel or infrared eye-tracking cameras.
Until recently, Tesla relied only on a torque sensor in the steering wheel to detect the presence of a human hand. Several users reported, however, that this system could easily be tricked by wedging an object such as an orange into the wheel, so that the car stopped reminding drivers to pay attention.
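As an illustration of why a torque-only check is so easy to fool, here is a minimal sketch, in Python, of the kind of escalation logic a Level 2 driver-monitoring system might implement. This is not Tesla’s code: the DriverState fields, the thresholds, and the read_state callback are all invented for the example.

```python
import time
from dataclasses import dataclass

# Hypothetical driver-monitoring escalation loop for a Level 2 system.
# Not Tesla's implementation: sensor names and thresholds are invented.

@dataclass
class DriverState:
    wheel_torque_nm: float   # torque applied to the steering wheel
    eyes_on_road: bool       # from a (hypothetical) infrared gaze camera

def attention_ok(state: DriverState, use_camera: bool) -> bool:
    # A torque-only check passes as long as *any* steady force is applied,
    # which is exactly why a wedged orange can trick it. Adding gaze
    # detection closes that loophole.
    torque_ok = state.wheel_torque_nm > 0.5
    return torque_ok and (state.eyes_on_road if use_camera else True)

def monitor(read_state, use_camera: bool = True,
            warn_after_s: float = 5.0, disengage_after_s: float = 15.0):
    """Escalate: visual warning, then disengage and hand control back."""
    last_attentive = time.monotonic()
    while True:
        state = read_state()
        now = time.monotonic()
        if attention_ok(state, use_camera):
            last_attentive = now
        elif now - last_attentive > disengage_after_s:
            print("Disengaging: driver must take over")
            break
        elif now - last_attentive > warn_after_s:
            print("WARNING: hands on wheel / eyes on road")
        time.sleep(0.1)

# Simulated usage: an "orange on the wheel" applies steady torque, but the
# gaze camera sees no eyes on the road, so the camera-aware check escalates.
if __name__ == "__main__":
    orange_trick = lambda: DriverState(wheel_torque_nm=1.2, eyes_on_road=False)
    monitor(orange_trick, use_camera=True,
            warn_after_s=0.5, disengage_after_s=1.5)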
Tesla has a lot of work ahead of it if it wants to differentiate itself from the competition with a self-driving system that is genuinely worthwhile: safe, efficient, and transparent. Road deaths cannot come without consequences.