A Tesla engineer has confirmed that the video showcasing the merits of Tesla's autonomous driving was staged. It is a statement that could damage the company's reputation and change the course of the investigation into the death of Walter Huang, killed in 2018 in an accident aboard a Model X with Autopilot engaged.
Tesla has been developing its driver-assistance technology, called Autopilot, for many years. The system arrived on the brand's vehicles in 2015 and has come as standard on all new Tesla cars since 2019. While Tesla was one of the first manufacturers to offer such a feature, it has also come under fire from critics on many occasions. Last October, the US justice system launched an investigation, accusing Elon Musk of misleading customers about Autopilot's real capabilities.
A damning revelation
But this isn't the first time authorities have targeted Tesla. In 2021, the NHTSA had already opened an investigation after a series of accidents, including one fatality, that occurred while Autopilot was engaged. Back in 2016, however, Elon Musk stated loud and clear that the FSD (Full Self-Driving) system, a somewhat more advanced offering, was "probably better than a human driver, and the person in the driver's seat is only there for legal reasons. They aren't doing anything at all. The car drives itself."
Those words were backed up by a 3-minute-46-second video showing the system in action. It shows footage of the interior and exterior of the car, which appears to perform flawlessly in every situation. Except that all of this may have been completely false. That, at least, is what Ashok Elluswamy, director of the division in charge of Autopilot, claims. In a deposition reported by The Verge, the engineer states that the video did not show the car's actual capabilities, but rather what the system might be able to do in the future.
A manipulated video?
The route is said to have been pre-recorded and mapped in advance, the car being unable to plan its own trip. As Bloomberg explains, Elon Musk reportedly monitored the operation closely and wrote in an email that it was normal to exaggerate the capabilities, since it was a demo. Over-the-air (OTA) updates were then planned to bring these features to customers later. Elluswamy also specifies that at the time the footage was shot, the car was not able to recognize traffic lights, contrary to what the video shows.
Already in 2021, former employees of the brand asserted that the video was not entirely authentic. The company reportedly concealed an accident that occurred during filming by deleting the footage. Is this proof that Autopilot is not as capable as the brand would have you believe? Especially since, for several years, the engineers in charge of the project have had reservations about Elon Musk's certainty about how the technology works.
Another alarming problem
Meanwhile, Mahmood Hikmet, head of research and development at Ohm Automation, highlights another concern. In his testimony, Elluswamy claims to have never heard the acronym ODD (Operational Design Domain). Yet this is a crucial element in the development of an autonomous car: it defines the conditions under which the autonomous driving system may operate, restricting its use at night or in the rain, for example, as in the Mercedes EQS. This probably means that Autopilot and FSD have no ODD and were meant to be used anywhere, anytime.
However, there are laws governing the use of this technology. In Europe, Level 3 autonomous driving can only be used on motorways, limited to 60 or 130 km/h depending on the country. This is why the Mercedes Drive Pilot, although rated Level 3, appears less capable in practice than Tesla's FSD, which remains Level 2.
That said, Tesla is clear in its manual, which specifies that "Autosteer is designed for use on limited-access roads" and that "failure to follow these instructions could cause damage, serious injury or death." The brand can therefore hardly be accused of offering a 100% autonomous system and of deceiving customers.
The main criticism I’ve received from Tesla defenders has been that Ashok Elluswamy, Tesla’s head of autopilot software, not knowing the term “ODD” is a “gotcha.”
I’m going to tell you why this is MUCH worse than you think.
— Mahmood Hikmet (@MoodyHikmet) January 18, 2023
For the moment, in any case, there is no question of offering Level 3 autonomous driving to customers while FSD is still in beta testing. Is that proof that the brand isn't entirely sure of its capabilities? Yet, according to a recent report, Autopilot is becoming safer. But certifying the system at Level 3, as in the EQS, comes with many constraints: in that case, the manufacturer is responsible in the event of an accident, and the limits of Level 3 autonomous driving, in particular the maximum speed, must be respected. Tesla doesn't seem ready to take on these obligations yet.
But if Tesla's systems have no ODD, it's probably because the brand intends them to be usable everywhere. In an interview with researcher Lex Fridman three years ago, Elon Musk mocked the acronym ODD, essentially arguing that it is letting humans, rather than machines, drive cars that is crazy.
Autopilot still seems to have a long way to go, even if the system reportedly prevents 40 accidents per day. For its part, FSD (Full Self-Driving) may have to change its name, since California has just passed a law prohibiting Tesla from marketing this technology as autonomous driving.