The National Highway Traffic Safety Administration (NHTSA) is opening a new investigation into Tesla's Autopilot semi-autonomous driving system after a Tesla crashed into a fire truck, killing the driver. This is not the first time the manufacturer has had to answer to the authorities on this issue.
Is Autopilot really as reliable as Tesla would have us believe? The question is fair to ask: for several years, incidents involving the semi-autonomous driving system, which arrived on the Model S in 2015, have been piling up. And the recent accident of February 18 in Contra Costa County, California, doesn't help matters.
A new investigation
A few days ago, firefighters were responding to a traffic accident, their truck parked across the road to block access and secure the area, when a Tesla crashed into the rescue vehicle, killing its driver instantly. The passenger was extricated and taken to hospital, and four firefighters were also transported for evaluation.
While nothing so far indicates that Autopilot was engaged at the time of the crash, the NHTSA (National Highway Traffic Safety Administration), the federal road safety agency in the United States, still wants explanations from Tesla. And for good reason: the agency has already spent two years investigating around a dozen accidents in which the brand's electric cars struck emergency vehicles.
Slow down and move when approaching emergency vehicles. Truck 1 was struck by a Tesla while blocking the lanes of I-680 from a previous accident. Driver declared dead on scene; the passenger was extricated and transported to the hospital. Four firefighters also transported for evaluation. pic.twitter.com/YCGn8We1bK
— Con Fire PIO (@ContraCostaFire) February 18, 2023
In fact, the Level 2 semi-autonomous driving system reportedly tends to disengage when approaching accident scenes, or simply fails to see fire trucks parked on the side of the road.
Tesla pushed an over-the-air update in 2021 that was supposed to fix the problem, as Automotive News recalls, but it apparently didn't really work. Note, however, that the police do not yet know whether Autopilot was active at the time of the accident.
Remember that the manufacturer decided to update Autopilot by removing its other sensors so that the system relies solely on cameras, an approach called Tesla Vision that the brand considers more effective.
While the manufacturer has claimed that its semi-autonomous driving system prevents roughly 40 accidents a day, the NHTSA doesn't entirely share that view and has launched several investigations. The agency has just ordered the recall of more than 362,000 of the brand's cars over an Autopilot flaw that allegedly causes the cars to break traffic laws and behave dangerously.
Among the reported problems is phantom braking, well known to Tesla owners, in which the car brakes sharply for no apparent reason. The phenomenon is not exclusive to the brand's vehicles, but it appears to be common in them, so much so that some customers alerted the authorities, even as Elon Musk claimed to have fixed the problem. It is another blow to the brand's reputation, recently damaged by an engineer's revelation that the 2016 video promoting Autopilot was staged.
In April 2021, we also showed that Tesla's published figures on the effectiveness of its self-driving system were, in fact, somewhat skewed. A few months ago, a US court also accused Elon Musk of misleading his customers about the real capabilities of Autopilot, after he said in 2016 that "the person in the driver's seat is only there for legal reasons. They are not doing anything at all. The car is driving itself."
Either way, the latest data suggests that Teslas are safer than other cars, with a lower risk of accident.
In June 2022, the threat of a ban hung over the system, as the NHTSA opened an investigation after an accident in which a Tesla's Autopilot reportedly disengaged one second before impact. Is this a way for the firm to clear itself of responsibility, by claiming that its system was not active at the moment of impact and that the fault therefore lies with the driver?
Be that as it may, and even if the system is far from perfect, it is wise to wait for the results of the various investigations before passing judgment. A few weeks ago, however, during the Super Bowl, The Dawn Project ran an ad against Tesla, accusing Autopilot of causing accidents. The same group had previously claimed that the system ran over child-sized mannequins in its tests, a claim disputed by several experts.
The manufacturer is currently preparing Hardware 4, a new generation of its onboard computer that will debut on the Cybertruck and could make autonomous driving, and in particular FSD (Full Self-Driving), more efficient and safer.