Tesla Ordered to Pay $243 Million Over Fatal Autopilot Crash


Tesla was found partially liable Friday for the role its Autopilot system played in a fatal 2019 crash in Florida. A jury found the automaker’s technology partly to blame for the crash, which killed a woman and severely injured a man, and ordered the company to pay $243 million in damages, according to a report by the Washington Post.
Related: Teslalternatives: What Should You Buy if You’re Tired of Your Tesla?
The driver was using Autopilot on his Tesla Model S when the car left the road and struck the couple; he told police he had taken his eyes off the road to pick up a dropped cellphone. The jury found that Tesla’s driver-assist technology enabled the driver to momentarily look away and then failed to warn him that the road was ending.
Attorneys for the plaintiffs argued that Tesla, citing interviews with CEO Elon Musk, acted recklessly by overstating the technology’s abilities and creating confusion about its limitations. They also argued that Tesla allowed Autopilot to function on roads it wasn’t designed for.
Safety and regulatory agencies have noted the same confusion, along with a disconnect between what some systems claim to do and what they can actually do. According to a recent study from the Insurance Institute for Highway Safety, many drivers misunderstand the capabilities and limitations of semi-autonomous systems, such as Tesla’s Autopilot and Full Self-Driving, and rely on them more than they should. IIHS also specifically called out versions of Tesla’s Autopilot and Full Self-Driving Beta in a 2024 test of partial-automation driving systems, giving them poor ratings across seven parameters.
This isn’t the first time the safety of the company’s autonomous systems has come into question. A recall in early 2023 affected 362,000 Tesla vehicles equipped with the Full Self-Driving Beta system. According to the National Highway Traffic Safety Administration, the system enabled the cars to drive straight through an intersection in a turn-only lane, enter an intersection with stop signs and not come to a complete stop, and enter an intersection during a steady yellow traffic signal without “due caution,” among other unsafe driving examples.
It likely won’t be the last time, either. Tesla currently faces several lawsuits across the country making similar safety allegations about its autonomous driving systems. The automaker said it plans to appeal the ruling in the Florida case.
More From Cars.com:
- Tesla Recalls 46,000-Plus Cybertrucks for Detaching Trim
- 239,000-Plus Tesla EVs Recalled for Rearview Camera Issue
- Tesla Recalls 694,000-Plus Vehicles Over Tire Pressure Monitoring System
- Study: Partial Driving Automation May Be Teaching Drivers Wrong Behaviors
- Find Your Next Car
Cars.com’s Editorial department is your source for automotive news and reviews. In line with Cars.com’s long-standing ethics policy, editors and reviewers don’t accept gifts or free trips from automakers. The Editorial department is independent of Cars.com’s advertising, sales and sponsored content departments.

News Editor Jennifer Geiger joined the automotive industry in 2003, much to the delight of her Corvette-obsessed dad. Jennifer is an expert reviewer, certified car-seat technician and mom of three. She wears a lot of hats — many of them while driving a minivan.