Tesla fails to fix Autopilot after fatal crash

Tesla hasn’t adjusted the limits of its Autopilot system since a driver was killed in a horrific crash in Florida in 2016, according to the company’s engineers, even as a family lawsuit over a similar fatal collision in 2019 heads toward a jury trial.

The electric-car maker made no changes to its driver-assistance technology to account for cross traffic in the nearly three years between two high-profile crashes that killed Tesla drivers whose cars slammed into the sides of trucks, according to recently disclosed testimony from several engineers.

For years, Tesla and CEO Elon Musk have touted self-driving as the way of the future. Over the last eight years, the company has come under legal pressure from consumers, investors, regulators and federal prosecutors who question whether it overstated its progress toward autonomous vehicles.

Tesla is also the subject of several investigations by the US National Highway Traffic Safety Administration into possible vehicle defects linked to at least 17 deaths since June 2021.


Musk against experts

The trial, scheduled for October, will be the company’s first over a fatality blamed on Autopilot. It will pit Musk, who has repeatedly insisted that Tesla’s cars are the safest ever made, against experts expected to testify that the company’s marketing has lulled drivers into a false sense of security.

A Florida judge last year excused Musk from being questioned under oath in the case. The billionaire CEO is “hands-on,” “very involved with product definition” and “very involved in making some decisions about how things should work” with Autopilot, according to excerpts of 2020 testimony from Tesla’s former head of Autopilot software, Christopher “CJ” Moore.


Attorneys for Tesla did not immediately respond to requests for comment.

The automaker has said that Autopilot’s limitations, including the challenge of detecting cross traffic in front of its cars, are made clear to drivers. Tesla warns in its owner’s manual and on in-car screens that drivers must stay alert and be ready to take control of the vehicle at any time.


Tesla won its first trial over a crash blamed on Autopilot earlier this year, when a Los Angeles jury rejected claims by a woman who said the driver-assistance feature in her Model S steered it into the center median of a city street.

The case of the Tesla that collided with a tractor-trailer

The case going before a jury in Palm Beach County, Florida, was brought by the family of Jeremy Banner, a 50-year-old father of three who engaged Autopilot in his Model 3 about 10 seconds before it crashed into the underbelly of a tractor-trailer in 2019. An investigation by the National Transportation Safety Board found that Banner did not see the truck crossing the two-lane road on his way to work. Autopilot didn’t see it either.

“Although the company was aware of the possibility of cross traffic, Autopilot at that time was not designed to detect it,” according to 2021 testimony from company engineer Chris Payne cited in a recent court filing. Engineer Niklas Gustafsson gave a similar account in 2021 testimony.

Last week, Banner’s widow revised her lawsuit to seek punitive damages, raising the stakes for Tesla at trial. She argues that the company should have reprogrammed Autopilot to account for such hazards after Tesla driver Joshua Brown was killed crashing into the side of a truck in 2016.


“There is evidence that Defendant Tesla engaged in willful misconduct and/or gross negligence in selling a vehicle with an Autopilot system that it knew had caused a terrible accident before,” the Banner family said in an amended complaint.

One of the expert witnesses retained by the Banner family is Mary “Missy” Cummings, who recently served as an adviser to the National Highway Traffic Safety Administration. Cummings, a professor at Duke University and a vocal skeptic of Autopilot, said in a court filing that Tesla is “guilty of willful misconduct and gross negligence” for failing to test and upgrade Autopilot between the Brown and Banner crashes.

Tesla has made public statements portraying its autonomous technology as more capable than it actually is, Cummings wrote.

After the 2016 crash, the company said it had modified how its driver-assistance system detects potential obstacles, after the system failed to make out the white side of the tractor-trailer against a bright sky. The new version relied more heavily on forward-scanning radar rather than camera sensors, Tesla said.

An NTSB investigation into the 2016 collision recommended that automakers limit the use of semi-autonomous systems to the road conditions for which they are designed.

Trey Lydell, an attorney for the Banner family, said Tesla allowed a “single defect” to take two lives three years apart.

“Not only was this flaw known to Tesla, but US government regulators warned that the system should not be used on highways with cross traffic or people would die,” he said in an emailed statement.

The case is Banner v. Tesla Inc., 50-2019-CA-0099662, Circuit Court for the 15th Judicial Circuit, Palm Beach County, Florida.
