Tesla Autopilot is a suite of advanced driver assistance features that includes lane centering, traffic-aware cruise control, self-parking, automatic lane changes, limited semi-autonomous navigation, and the ability to summon the car from a garage or parking spot. These features, although impressive, rely on driver supervision and are meant to reduce accidents caused by driver negligence or fatigue. They are not a replacement for responsible driving.
However, with Tesla aiming to offer Full Self-Driving (FSD) soon, Tesla accidents are being analyzed extensively to determine whether the fault lies with overly aggressive automation. As of May 2021, there have been seven recorded fatal crashes involving Tesla Autopilot, and its involvement has been suspected in many other incidents as well.
The most recent occurred near Fontana, California, on May 5 at 2:40 a.m. A Tesla Model 3 crashed into an overturned truck on a highway, killing the Tesla’s driver and injuring the truck driver and a passing motorist. The National Highway Traffic Safety Administration (NHTSA) is opening a safety probe into the fatal crash amid growing concerns about the automaker’s driver assistance systems. It has not been confirmed whether Autopilot was engaged when the impact took place.
According to a San Bernardino County Coroner’s report, the driver was 35-year-old Steven Michael Hendrickson from Running Springs, California. Hendrickson frequently posted to his social media accounts to show off the technical features of his Model 3.
This fatal crash, coming just a month after one in Texas that killed two people, is undoubtedly concerning. Tesla has been under scrutiny over Autopilot since a Tesla Model S crashed in suburban Houston, an incident that sent the company’s stock down 3.4%. The crash proved controversial because local law enforcement officials who rushed to the site reportedly found no one in the driver’s seat: one body was in the front passenger seat and the other in the back. This prompted reports that the Tesla may have been in autonomous mode, leading many to question the safety standards of the technology. However, the owner’s home security cameras captured one of the victims entering the driver’s seat.
Tesla’s Chief Executive, Elon Musk, claimed that Autopilot could not have been engaged under the crash conditions. The National Transportation Safety Board (NTSB) appeared to corroborate this statement when it tested a similar Model S on the same road and found it impossible to engage Autosteer where the crash occurred.
Regardless, the NHTSA has opened 29 special investigations into Tesla crashes to date, of which 25 are still underway; at least four of the crashes have occurred since March 2021.
While it has not been confirmed that the system was at fault each time, many doubt the wisdom of letting a machine do the driving. Tesla needs systems that guarantee the driver’s attention even when Autopilot is engaged. Elected officials and senators are seeking to make driver monitoring systems mandatory for vehicles that feature automated driving systems like Tesla’s Autopilot and FSD, GM’s Super Cruise, or Ford’s Co-Pilot360.
On its website, Tesla warns that its driver assistance systems do not make its vehicles fully autonomous and that active driver supervision is still necessary. The internet, however, is filled with videos of moving Teslas whose drivers are asleep or have their hands off the wheel for extended periods. Param Sharma, in particular, has been arrested twice for doing exactly that and yet claims to have no qualms about letting his Tesla drive itself again. As long as drivers remain irresponsible in such ways, a driver monitoring system is not just advisable but necessary for safety in semi-automated vehicles like Teslas.