Tesla safety challenged again

In the fall of 2016, Tesla beamed new software over the air to cars on the road in the United States and elsewhere that added safeguards to its Autopilot system to prevent drivers from looking away from the road or keeping their hands off the steering wheel for long periods.

The move came in the wake of a crash in Florida in which an Ohio man died when his Model S sedan hit a tractor-trailer while Autopilot was engaged. Federal investigators found that the driver’s hands had been on the steering wheel for only a few seconds in the minute before the crash.

When the upgrades were released, Tesla’s chief executive, Elon Musk, said the new Autopilot system was “really going to be beyond what people expect” and would make the Tesla Model S sedan and the Model X sport utility vehicle the safest cars on the road “by far.”

Now, however, Tesla’s semiautonomous driving system is coming under new scrutiny after the company disclosed late Friday that a fatal crash on March 23 in California occurred while Autopilot was engaged.

The company said the driver, Wei Huang, 38, a software engineer for Apple, had received several visual and audible warnings to put his hands back on the steering wheel but had failed to do so, even though his Model X SUV had the modified version of the software. His hands were not detected on the wheel for the six seconds before his Model X slammed into a concrete divider near the junction of Highways 101 and 85 in Mountain View, and neither Huang nor Autopilot activated the brakes before the crash.

The accident renews questions about Autopilot, a signature feature of Tesla vehicles, and whether the company has gone far enough to ensure that it keeps people safe.

“At the very least, I think there will have to be fundamental changes to Autopilot,” said Mike Ramsey, a Gartner analyst who focuses on self-driving technology. “The system as it is now tricks you into thinking it has more capability than it does. It’s not an autonomous system. It’s not a hands-free system. But that’s how people are using it, and it works fine, until it suddenly doesn’t.”

On Saturday, Tesla declined to comment on the California crash or to make Musk or another executive available for an interview. In its blog post Friday about the crash, the company acknowledged that Autopilot “does not prevent all accidents,” but said the system “makes them much less likely to occur” and “unequivocally makes the world safer.”

For the company, the significance of the crash goes beyond Autopilot. Tesla is already reeling from a barrage of negative news. The value of its stock and bonds has plunged amid increasing concerns about how much cash it is using up and the repeated delays in the production of the Model 3, a battery-powered compact car that Musk is counting on to generate much-needed revenue.

It is also facing an investor lawsuit related to Tesla’s acquisition of SolarCity, a solar-panel maker where Musk was serving as chairman. Meanwhile, competition is mounting from other luxury carmakers that have developed their own electric cars, while Waymo, the Google spinoff, General Motors and others seem to have passed Tesla in self-driving technology.

“There’s a lot going on that undermines Elon’s credibility right now,” said Karl Brauer, a senior analyst at Kelley Blue Book.

Autopilot uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate with little input from the driver. Tesla readily points out that Autopilot — despite the implications in its name — is only a driver-assistance system and is not intended to pilot cars on its own.

Drivers are given warnings on the dashboard and in the owner’s manual to remain engaged and alert while using it. Tesla originally described it as a “beta” version, a term that usually refers to software still in the developmental stage.

At least three people have now died while driving with Autopilot engaged. In January 2017, a Chinese owner was at the wheel of a Model S when the car crashed into a road sweeper on a highway.

The National Transportation Safety Board is investigating the March 23 crash that killed Huang. Its investigation of the 2016 Florida accident concluded that Autopilot “played a major role” and that the system lacked safeguards to prevent misuse by drivers.

Source

http://watertowndailytimes.com/national/tesla-safety-challenged-again-20180401
