Tesla accused of misrepresenting Autopilot in Florida crash lawsuit

The plaintiff's Model S had Autopilot enabled when it hit a stationary car on the highway.

Andrew Krok Reviews Editor / Cars

Tesla has once again found itself in court following a collision involving its Autopilot driver-assist system.

Shawn Hudson, represented by the Morgan & Morgan law firm, has filed suit against Tesla in Orange County, Florida, seeking damages related to an automobile crash on Florida's Turnpike. The court document filed by Morgan & Morgan alleges that Tesla "has duped consumers" into believing Autopilot is more capable than it actually is.

Here's what happened. Hudson was driving his Model S on Florida's Turnpike (State Road 91) between his home in Winter Garden and his job at a Nissan dealership in Fort Pierce. He relies on Autopilot to reduce the tedium of his 125-mile commute, but as the Model S approached a disabled vehicle in the left lane, it kept going and collided with the disabled vehicle, destroying the Tesla's front end and leaving Hudson with "severe permanent injuries," according to the complaint.

In addition to the counts of liability, negligence and misrepresentation filed against Tesla, the plaintiff also brought a count of negligence against the owner of the disabled vehicle for leaving it in the roadway.

"In this case, the car was incapable of transmitting log data to our servers, which has prevented us from reviewing the vehicle's data from the accident," a Tesla spokesperson said in an emailed statement. "However, we have no reason to believe that Autopilot malfunctioned or operated other than as designed."

The statement also stressed that driver vigilance remains paramount. "When using Autopilot, it is the driver's responsibility to remain attentive to their surroundings and in control of the vehicle at all times. Tesla has always been clear that Autopilot doesn't make the car impervious to all accidents, and Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not."


Hudson's Model S suffered severe front end damage after colliding with the disabled Ford Fiesta.

Morgan & Morgan

To that end, the owner of the Model S did admit to the Orlando Sentinel that he was looking at his phone periodically while Autopilot was enabled. "But never do I trust the car 100 percent, so I was looking up, looking down, looking up, looking down, and I'm looking up and a car's disabled in the passing lane on the Turnpike," Hudson told the Sentinel. Florida law permits taking phone calls while driving, but texting while driving is illegal.

Mike Morgan, the Morgan & Morgan attorney representing Hudson, criticized Tesla's statement in a phone call to Roadshow. "To me, that [response] is just puzzling," Morgan said. "If that's how [Autopilot] is intended to be used, and how it's supposed to function, they have a huge problem."

The complaint alleges that the sales representative Hudson encountered at the Tesla store he visited overstated Autopilot's capabilities, claiming it could "allow the vehicle to drive itself from one point to another with minimal user input or oversight." Morgan reiterated this in his phone call with Roadshow, adding that he believes the company has mischaracterized Autopilot's capabilities up and down the ranks, from how the automaker arranges its promotional materials to how its sales staff pitch the system.

This is not the first lawsuit to make this claim. Tesla was sued in Utah in September after a driver's Model S crashed into a fire truck stopped at a red light. Autopilot was enabled at the time, and the driver was allegedly under the impression that the vehicle would stop for all blockages in the car's path.

Following some high-profile crashes featuring Autopilot, Ars Technica ran a feature on why Autopilot and other driver-assist systems (like automatic emergency braking) keep missing stationary objects in a car's path. Long story short, these systems are intentionally designed to ignore stationary objects on fast roadways: given long stopping distances, false positives could make an automatic panic stop on the highway more dangerous for other drivers on the road. The feature also argues that this line of thinking "no longer makes sense," because such systems can lull drivers "into a false sense of security and cause them to pay less careful attention to the road."
