Neighbor News
Curses and Crashes: Tesla Settles in Autopilot Lawsuit
It was supposed to be the future: effortless driving where a computer did everything and passengers could ride without a worry in their heads.
Then small accidents began to occur. A fender bender here, a significant impact there, until the first Autopilot-related death in 2016, when 40-year-old Joshua Brown's Autopilot failed to distinguish a white 18-wheeler from the bright spring sky.
With the most recent incidents including the death of Tesla enthusiast Walter Huang, consumers and regulators alike have been voicing major concerns.
Incidents like these led to the class action lawsuit against the company, which was finally settled on May 24.
What Was the Lawsuit?
“Class action lawsuits provide a way for large groups to seek recompense for injuries,” says an attorney with Newsome Melton law firm. “They are frequently used with defective products.”
The class action against Tesla involved complaints that the semi-autonomous driver assist system in its Model S and Model X vehicles was “essentially unusable and demonstrably dangerous.”
Users claimed that Tesla misrepresented the safety of the system on its website, assuring buyers it would make driving safer. The site states, “All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.”
However, consumers complained that the program was largely inoperable and faulty enough to cause more harm than good.
What Is Causing the Crashes?
For the most part, the issues seem to revolve around three major points: a lack of driver awareness, inadequate data, and sensors misreading environmental factors.
As Tesla clearly points out about its Autopilot program, drivers must continue to watch the road. In fact, warnings will sound if sensors detect that drivers have removed their hands from the wheel. According to Tesla, this is exactly what occurred in the most recent fatality. The victim, Huang, had been given several warnings to return his hands to the wheel. Six seconds prior to the crash, his hands were not detected.
Inadequate data is also playing a role. The program relies on software updates built from data collected from owners' vehicles. It's a process called fleet learning: as more data is uploaded, the program becomes safer.
However, according to the consumers behind the lawsuit, software updates have been slow in coming, and a single bit of misinformation can lead to dire consequences.
In Huang’s case, a highway divider had been removed or crushed without being replaced; the car didn’t know and headed directly for it.
Sensors misreading their surroundings are another problem altogether; in situations like Brown’s, it can mean the car does not “see” oncoming hazards.
Tesla Settles
In an effort to “do right” by its customers, Tesla settled out of court. Class members will receive anywhere from $20 to $280 in compensation.
Every new technology comes with a learning curve. Until the kinks are worked out, however, it’s best we all keep our eyes on the road.