Lawsuits challenge Tesla's claim that drivers are at fault in Autopilot crashes


SAN FRANCISCO — As CEO Elon Musk bets Tesla's future on self-driving, lawyers from California to Florida are painstakingly dissecting the company's most popular driver-assistance technology, arguing that Autopilot is not safe for widespread use by the general public.

At least eight lawsuits headed to trial in the coming year, including two previously unreported cases, involve fatal or otherwise serious crashes that occurred while the driver was allegedly relying on Autopilot. The complaints argue that Tesla exaggerated the capabilities of the feature, which controls steering, speed and other actions typically left to the driver, creating a false sense of complacency that ended in tragedy.

Evidence emerging in these cases, including dashcam footage obtained by The Washington Post, reveals sometimes shocking details. In Phoenix, a woman allegedly relying on Autopilot plowed into a disabled car and was then struck and killed by another vehicle after stepping out of her Tesla. In Tennessee, an intoxicated man allegedly using Autopilot drove down the wrong side of the road for several minutes before crashing into an oncoming car, killing the 20-year-old man inside.

Tesla maintains that it is not liable for the crashes because the driver is ultimately in control of the vehicle. But that position has come under growing pressure from federal regulators and others. Late Thursday, the National Highway Traffic Safety Administration launched a new review of Autopilot, signaling concern that a December recall had failed to significantly curb misuse of the technology and that drivers are misled into believing the software “has better capabilities than it actually does.”

Meanwhile, in a surprising reversal, Tesla this month settled a high-profile lawsuit in Northern California brought by the family of Apple engineer Walter Huang, which alleged that Autopilot played a role in his fatal crash. And a Florida judge has concluded that Tesla “knew” its technology was “defective” under certain conditions, a ruling that legal experts say adds new momentum to cases once considered long shots.

“Increasingly, these cases are going to jury trials,” said Brett Schreiber, an attorney at Singleton Schreiber who represents the family of Giovani Maldonado, a 15-year-old killed in Northern California in 2019 when a Tesla operating on Autopilot rear-ended his family's pickup truck.

Tesla did not respond to multiple requests for comment on the lawsuit.

Dashcam footage from August 2019 shows the Tesla rear-ending another vehicle. A 15-year-old passenger in that vehicle was killed. (Video: Obtained by The Washington Post)

The outcome of the litigation could have a significant impact on the company. Tesla's stock has lost more than a third of its value since the beginning of the year, and last week the company reported a sharper-than-expected 55 percent drop in first-quarter profit as it struggles with declining electric-vehicle sales and fierce competition from China. To allay investor concerns, Musk has made lofty promises to launch a fully autonomous “robotaxi” in August, saying on Tuesday's earnings call that driving a car will soon be like riding an elevator: you get in and get out at your destination.

“We should be thought of as an AI or robotics company,” Musk told investors. “If someone doesn't believe Tesla is the solution to autonomy, I don't think they should invest in the company. But we will.”

Meanwhile, the company has defended itself in court documents, arguing that its user manuals and on-screen warnings make it “very clear” that drivers must be in full control while using Autopilot. Several of the upcoming cases involve drivers who were allegedly distracted or impaired.

In response to a lawsuit filed in Florida in 2020, Tesla said that Autopilot “is not self-driving technology and does not replace the driver,” adding that “the driver can and must brake, accelerate, and steer as if the system were not activated.”

But the Huang case also involved an allegedly distracted driver: Huang was said to have been playing a video game when his Tesla crashed into a highway barrier in 2018. Tesla has not said why it decided to settle, and the terms of the settlement were not disclosed in court documents.

More details of deadly crashes revealed

Meanwhile, federal regulators appear increasingly sympathetic to claims that Tesla oversells its technology and misleads drivers. Even the decision to call the software Autopilot “evokes the idea that the driver is not in control” and encourages “driver over-reliance on automation,” NHTSA said Thursday, adding that its two-year investigation had identified 467 Autopilot-related crashes, 13 of them fatal.

NHTSA did not release details about those crashes. But two fatal accidents that occurred in 2022 are described in detail in previously unreported lawsuits.

In May 2022, Ewanda Mitchell, 49, was driving a Tesla in Phoenix when she slammed into a Toyota Camry that had stalled on the highway, according to court documents and dashcam footage obtained by The Post. Autopilot and the car's other features, such as forward collision warning and automatic emergency braking, failed to make the Tesla take evasive action or prevent it from crashing into the stalled sedan, said Jonathan Michaels of MLG Attorneys, the Mitchell family's lawyer.

After Mitchell got out of her car, she was struck by an oncoming vehicle and killed.

Tesla did not respond to a request for comment on the crash. In its January 2024 response to the complaint, Tesla denied the allegations and said it had “not yet had the opportunity to inspect” Mitchell's vehicle.

Ewanda Delsey Mitchell's Tesla failed to detect a broken-down sedan in the middle of a Phoenix freeway. (Video: Obtained by The Washington Post)

About a month later, in Sumner County, Tennessee, Jose Roman Jaramillo Cortez drank two beers and three tequila shots after his shift at a local restaurant, then got into his Tesla Model 3, according to court documents. He said he entered his address into the Tesla's GPS and turned on Autopilot.

The car then ended up on the wrong side of the road, according to a lawsuit filed in June 2023 and dashcam footage obtained by The Post. The Tesla traveled south in the northbound lanes for several minutes before colliding with a car driven by Christian Malone, 20, who died on impact. “The accident was caused by the driver's negligence and/or recklessness,” Tesla said in its response to the complaint.

Trial dates for both cases are expected to be set for late next year, Michaels said.

In a separate case scheduled for trial in November in Key Largo, Florida, a self-driving Tesla allegedly failed to detect an approaching T-intersection while the driver searched for a dropped cellphone. The Tesla blew through flashing lights and a physical barricade and crashed into a vehicle parked on the side of the road, killing a woman and seriously injuring a man.

In court documents, Tesla argues that ultimate responsibility for a vehicle's trajectory rests with the driver. Tesla also notes in its user manual that Autopilot may not work as intended “if lane markings cannot be accurately determined” or if “bright light obstructs the camera's view.”

If these cases go to trial, juries could be asked to weigh whether Tesla's warnings to drivers are enough to absolve the company of liability. Ross Gerber, CEO of Gerber Kawasaki Wealth and Investment Management, said the last thing Tesla needs is a highly publicized legal battle that draws attention to these questions.

At trial, “the defense is going to dig into the weeds… and it's going to be very clear that the perception of the Autopilot software is very different from reality,” Gerber said. “Every day is going to be a headline and it's going to be embarrassing.”

So far, Tesla has faced a jury only once over Autopilot's possible role in a fatal crash. Last year in Riverside, California, a jury heard the case of Micah Lee, 37, who was reportedly using Autopilot when his Tesla Model 3 suddenly veered off the highway at 105 mph, crashed into a palm tree and burst into flames. Lee died of his injuries, and his fiancée and son were seriously injured.

Tesla said the extensive damage to the car made it impossible to determine whether Autopilot was engaged at the time of the crash. During the trial, Tesla attorney Michael Carey argued that the technology was not defective and that the accident was “classic human error.” A toxicology report taken after the crash showed that Lee had alcohol in his system, but below California's legal limit.

“This case is not about Autopilot. Autopilot did not cause the crash,” Carey said in his opening statement. “This was a terrible crash with serious injuries, and it may have resulted from a terrible mistake, but you can't blame the car company when that happens. This is a good car with a good design.”

In the end, Tesla's arguments prevailed: the jury found the company not liable.

But the company appears to face headwinds in other cases as well. Last year, Florida Circuit Court Judge Reid Scott upheld a plaintiff's claim for punitive damages in a case stemming from a fatal 2019 crash in Delray Beach, Florida, in which Jeremy Banner's Tesla, operating on Autopilot, failed to recognize a semi-truck crossing its path. The car crashed under the truck at full speed, killing Banner on impact.

Video obtained exclusively by The Washington Post shows the moment a Tesla on Autopilot crashes into a parked truck on a rural road in Florida in 2019. (Video: Obtained by The Washington Post)

In his ruling, Scott said the family's attorneys had presented “sufficient” evidence to reasonably seek punitive damages that could run into the millions of dollars at trial.

According to the order, the plaintiff's evidence included indications that Tesla “knew that the vehicles in question were equipped with a defective Autopilot system.” Citing other Autopilot-related fatalities, Scott wrote that there was a “genuine” dispute over whether Tesla “created a foreseeable area of danger that posed a general threat of harm to others.”

Tesla's appeal against this ruling is pending.

A change in defense strategy?

Tesla's move to settle these lawsuits signals a new willingness to deal, even after Musk vowed on Twitter in 2022 that the company would never settle “unjust lawsuits against us, even if we lose.”

In addition to settling the Huang lawsuit, Tesla “suggested” it was prepared to discuss a possible settlement of the Riverside case that went before a jury last fall, said Michaels, the MLG attorney who also represented Lee's family.

The month-long trial featured testimony from accident reconstruction experts, top Tesla engineers and emergency personnel who responded to the crash, who described it as one of the most horrific accidents they had ever seen. Michaels said he declined to participate in settlement negotiations because he “wanted to make this a really public issue” and because he was not confident he could obtain a satisfactory amount.

Tesla and Carey, its attorney in the case, did not respond to requests for comment.

After four days of deliberations, the jury found in Tesla's favor.

Although Michaels lost, he said the case garnered media attention and gave other lawyers working on cases against Tesla insight into the company's defense strategy. Additionally, he said his law firm's phones have been swamped with potential clients since then.

“We got out of bond,” Michaels said. “But that wasn't the point.”


