Federal regulators are investigating Tesla’s Autopilot mode following a crash that killed a Tesla driver who was using the software.
The crash occurred on May 7 in Williston, Florida, when a tractor-trailer made a left turn in front of the Tesla at an intersection, according to the National Highway Traffic Safety Administration.
Tesla (TSLA) said in a blog post that its Autopilot system did not recognize the white side of the tractor-trailer against a brightly lit sky, so the brakes were not activated. It also noted that this is the first known fatality in more than 130 million miles driven with Autopilot activated.
“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” Tesla said in the statement. “Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety.”
Tesla called the crash a rare circumstance given how high a tractor-trailer rides above the ground. In the crash, the Model S passed under the trailer, with the bottom of the trailer striking the car’s windshield.
The crash comes at a time when national interest in autonomous driving is growing. Experts say self-driving systems could improve safety and reduce the 1.25 million motor vehicle deaths that occur on roads worldwide every year. Many automakers have circled 2020 as the year when self-driving systems will be released on public roads. The federal government is expected to release rules for self-driving vehicles this summer.
Tesla’s Autopilot has been the most aggressive deployment of an autonomous driving system by any automaker.
Tesla’s Autopilot allows drivers to take their hands off the wheel and feet off the pedals, assuming most of the driving responsibilities. Drivers are expected to keep their eyes on the road so they can retake control if necessary. However, nothing in the Tesla system forces an inattentive driver to watch the traffic ahead.
Experts have cautioned since Tesla unveiled Autopilot in October 2015 that the nature of the system could lead to unsafe situations, as drivers may not be ready to safely retake the wheel.
If Tesla’s Autopilot determines it can no longer drive safely, a chime and a visual alert signal drivers to resume operation of the car. A recent Stanford study found that a two-second warning, which is longer than Tesla drivers are guaranteed to receive, was not enough time to rely on a driver to safely retake control of a vehicle that had been driving autonomously.
Tesla’s cars are built with an automatic braking system, but it is not foolproof and did not activate in this crash.