Tesla announced that its Autopilot self-drive system was activated moments before a fiery and fatal crash involving one of the company’s Model X crossovers. The crash occurred near Mountain View, Calif., when a Model X careened directly into a barrier separating two lanes of highway traffic. The driver, reported to be a 38-year-old employee of Apple, was the sole fatality, though two other vehicles were involved. The entire front end of the Model X was destroyed in the impact, and an intense fire broke out following the collision.
In its statement, Tesla reported that Autopilot had been turned on and the driver might have ignored warnings to take control of the vehicle.
“Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” read Tesla’s updated blog post regarding its crash investigation. “The driver had received several visual and one audible hands-on warning earlier in the drive, and the driver’s hands were not detected on the wheel for six seconds prior to the collision.”
Tesla further stated, “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”
According to Tesla's findings, data recorders indicate no attempt was made to brake or steer the vehicle out of harm's way in the seconds leading up to the crash.
Autopilot is meant to be a driving aid, not a completely autonomous system. Using a combination of cameras, radar and ultrasonic sensors mounted on the car's exterior, a Tesla can monitor the road, stay in its intended lane of travel and, working in conjunction with the adaptive cruise control, maintain a safe distance from the car ahead. Drivers can take their hands off the steering wheel for short periods, but after a few seconds a series of visual and audible warnings begins, asking them to place their hands back on the wheel.
In its first report, Tesla's blog post made a point of noting that the highway crash barrier had been severely damaged before this collision. Photos posted online show the barrier in its original state — as seen via Google Street View — and in a much shorter, already-crushed form the day before the fatal Model X crash. The lack of an intact crash attenuator could have made the accident worse. Tesla, however, appears to dismiss the possibility that the vehicle's Autopilot sensors failed to detect the shortened divider and adjust the vehicle's path accordingly.
As for the fire following the collision, Tesla stated the “battery packs are designed so that in the rare circumstance a fire occurs, it spreads slowly so that occupants have plenty of time to get out of the car. According to witnesses, that appears to be what happened here as we understand there were no occupants still in the Model X by the time the fire could have presented a risk.”
Feds Still Investigating
Despite its relatively quick online responses, not everyone is pleased with how Tesla has handled the accident investigation. The Washington Post reports the National Transportation Safety Board is "unhappy" that Tesla announced its own findings ahead of the NTSB's ongoing investigation into the causes of the crash. According to the report, the NTSB investigation will take weeks before its findings are published.
While Autopilot's behavior may once again be called into question, it's worth noting that the 2018 Model X was recently awarded a five-star safety rating by the National Highway Traffic Safety Administration, the highest possible score. However, Tesla is known to be vocal when it disagrees with less favorable crash test scores. That was the case last year, when the Insurance Institute for Highway Safety gave the Model S sedan an "acceptable" rating in the notoriously difficult small front overlap crash test, which simulates a frontal collision involving only part of the vehicle's width. Tesla questioned the "methods and motivations" behind the test — though the automaker never explained why the nonprofit safety organization would single out the Model S.
Accidents involving self-drive systems have been thrust into the spotlight following a deadly crash involving an autonomous Uber test car in Tempe, Arizona. The crash occurred when the self-driving Uber failed to stop or swerve around a pedestrian crossing the road outside a marked crosswalk. Onboard cameras show the Uber's drive system did not react in time; meanwhile, the safety driver behind the wheel was looking down, not at the road ahead.
The Tempe crash is widely considered the first fatal accident involving a fully self-driving vehicle, and further information about its causes could be difficult to ascertain. Uber recently reached a settlement with the victim's family, according to Reuters, helping it avoid a public court case that could further damage the ride-hailing company's autonomous-car research.
Cars.com’s Editorial department is your source for automotive news and reviews. In line with Cars.com’s long-standing ethics policy, editors and reviewers don’t accept gifts or free trips from automakers. The Editorial department is independent of Cars.com’s advertising, sales and sponsored content departments.