Tesla Deemed Partly Responsible for Self-Driving Death

By Bridget Clerkin | September 25, 2017
An NTSB report says Tesla is partly to blame in the first death involving a self-driving car.

The 2016 death of a Tesla driver at the wheel of a Model S operating on Autopilot is at least partly the fault of the Silicon Valley carmaker, a new federal report declares.

The bold statement, issued last week by the National Transportation Safety Board (NTSB), contradicts an earlier determination by the National Highway Traffic Safety Administration (NHTSA) that absolved the company and its experimental technology of any blame for the incident. The NHTSA report, issued shortly after the crash, instead named human error as the sole cause of the deadly accident.

Still, that version of events doesn't account for the fact that Tesla knowingly sells a car whose driver-assist system can easily be misused or over-relied upon by drivers, according to the NTSB. The independent federal agency is charged with investigating crashes involving planes, trains, and a number of other vehicles.

The crash rocked the autonomous automobile world in May 2016 as the first self-driving incident to result in a death. Tesla driver Joshua Brown was behind the wheel when the car, operating in its driver-assist mode, smashed into the side of a semi-truck at a Florida intersection, reportedly because the system had difficulty distinguishing the truck's long, white trailer from the sky.

While the NTSB report also laid some of the blame on the truck driver's failure to yield, it noted that an "overreliance" on Tesla's semi-autonomous system also contributed to the accident. The car was traveling at 74 mph in a 65 mph zone at the time of the crash, and certain safeguards that should have prevented Brown from using the Autopilot system under those conditions were "lacking," the report says. (Still, the agency reported that Brown received seven warnings from the Autopilot system about his prolonged hands-free use of the driver-assist mode, which remained engaged for 37 of the trip's 41 minutes.)

Whether Tesla will be penalized in the wake of the investigation is unclear, though it appears unlikely. The NTSB report doesn't recommend any disciplinary action, but it lists several suggestions for preventing similar incidents in the future, including the development of applications that more closely assess a driver's level of engagement and intervene more forcefully when the system determines a driver is not paying enough attention to the road.

The agency's call for a more cautious approach to self-driving technology may be unique within the federal government, which has done much to fast-track the vehicles' development. According to NTSB Chairman Robert Sumwalt, caution should still be paramount when taking a driverless ride.

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” he said in the report. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong.”
