CARS.COM — The National Transportation Safety Board concluded that “operational design” permitted a driver to rely too much on self-driving features in a Tesla crash some 16 months ago. The Florida collision killed the Tesla driver, 40-year-old Joshua D. Brown, when his 2015 Model S hit a semitrailer crossing the road in front of it.
NTSB cited driver error as a factor in the crash, both the truck driver’s failure to yield to the Model S and the Tesla driver’s overreliance on self-driving features. But the agency charged that the Model S by design allowed “prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.” NTSB went on to criticize too-few safeguards for Tesla’s Autopilot system, which can center the car in its lane and accelerate and brake on its own.
Tesla maintains that Autopilot is not a self-driving system. Still, the way it gauges driver involvement — primarily through steering interactions — is ineffective at ensuring driver engagement, NTSB said.
The agency recommended to federal regulators and automakers that any cars with self-driving features do the following:
- Record data and maintain it in a standardized format.
- Include safeguards to limit such features to areas where they’re designed to be used.
- Incorporate ways to verify those safeguards.
- Develop better ways to assess driver engagement.
- Report all incidents involving crashes for cars with self-driving features.
In response to the report, Tesla said Autopilot “significantly increases safety,” citing a January finding by the National Highway Traffic Safety Administration that the system’s lane-centering steering feature reduces accident rates by 40 percent.
“We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology,” the automaker said in an emailed statement to Cars.com. “We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”
Today’s meeting comes three months after NTSB released a 538-page docket with preliminary information gathered by investigators on the May 2016 crash. The agency found, among other things, that Autopilot’s adaptive cruise control and lane-centering steering were engaged when the Model S struck the trailer at 74 mph.
The driver should have seen the trailer crossing “at least seven seconds prior to impact,” the agency said at the time, but the lane-centering steering system had only detected the driver’s hands on the wheel for a combined 30 seconds or so during the 37.5 minutes of driving prior to impact. At the time, Autopilot would issue warnings for the driver to keep his or her hands on the wheel after one to five minutes of inaction, eventually disengaging if the driver didn’t comply. But activities like adjusting the cruise control could reset the timer even if they didn’t involve any steering. Indeed, two minutes before the crash, vehicle data indicate Brown adjusted cruise-control speed, NTSB said.
Shortly after the crash, Tesla said that Autopilot, which relies on a collection of radar, camera and ultrasonic sensors, failed to identify the white trailer against a “brightly lit sky.” Since the crash, the automaker has updated Autopilot to reduce the maximum hands-off time before warnings begin to escalate. If the driver ignores them altogether, the system now disables Autopilot for the rest of the trip. (Tesla issues over-the-air software updates to its vehicles through an onboard internet connection, so it can update how features work for new buyers and existing owners alike.)
“Automation in highway transportation has the potential to save tens of thousands of lives,” NTSB Chairman Robert L. Sumwalt III said in a statement. But “until that potential is fully realized, people still need to safely drive their vehicles.”