Tesla's Autopilot keeps causing its cars to crash

Tesla is facing heat from federal officials following another fatal accident involving Autopilot. The National Transportation Safety Board (NTSB) recently found that Tesla's semi-autonomous driving feature was partially to blame in a fatal 2018 car crash, adding yet another accident to the technology's already worrisome record. What's even more concerning is that Tesla doesn't appear particularly interested in addressing the problem.

That Tesla's Autopilot has been implicated in a crash isn't new. In fact, after this investigation, NTSB chairman Robert Sumwalt pointed out that in 2017 his agency called on Tesla and five other carmakers to limit self-driving features and to build better technology to monitor drivers in semi-autonomous cars. Tesla is the only company that hasn't formally responded to those recommendations, though it did start warning drivers more quickly when they take their hands off the wheel.

But it seems the company is unwilling to address its self-driving technology's shortcomings or to ensure that its drivers properly understand what the Autopilot feature can and can't do. The NTSB's findings serve as a stark reminder that the federal government has a role to play in regulating these technologies, and that its light-touch approach so far doesn't seem to be working.

"We urge Tesla to continue to work on improving Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken when necessary," Sumwalt told reporters. "It's time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars."

Here's the background: Two years ago, a 2017 Model X with its Autopilot feature engaged was driving along a highway in Mountain View, California, when it struck a concrete barrier at more than 70 miles per hour. The crash was ultimately fatal for the driver, who died of injuries related to blunt force trauma.

After a months-long investigation, the agency identified seven safety issues related to the crash, including limitations of Tesla's crash avoidance system and driver distraction. Among its findings: the driver appears to have been playing a game on an iPhone provided by his employer, Apple, and didn't notice when Autopilot steered the electric vehicle off course.

"The Tesla Autopilot system did not provide an effective means of monitoring the driver's level of engagement with the driving task, and the timing of alerts and warnings was insufficient to elicit the driver's response to prevent the crash or mitigate its severity," reads the report. "Tesla needs to develop applications that more effectively sense the driver's level of engagement and that alert drivers who are not engaged."

The board also found that Tesla needed a better system for avoiding collisions. Like many semi-autonomous driving systems, Tesla's Autopilot can only detect and respond to situations it has been programmed and trained to deal with. In this case, the Model X's software never detected a crash attenuator (a barrier intended to reduce impact damage) that was damaged and not in use at the time of the crash, causing the car to accelerate into it.

Tesla didn't respond to Recode's request for comment by the time of publication.

So what happens now? Tesla has argued that its cars are safer than average vehicles, but these crashes keep happening, and fatal crashes involving Autopilot seem increasingly common. Meanwhile, Consumer Reports has continued to find issues with vehicles that have these autonomous abilities. Last year, the organization reported that Autopilot's Navigate feature could lag far behind a human driver's skills.

Security researchers have also said that it wouldn't take much to trick these vehicles. Researchers have shown how placing stickers on the road could coax a Tesla into dangerously switching lanes while the Autopilot system was engaged. And last week, the computer security company McAfee released findings that a Tesla using the intelligent cruise control feature could be tricked into speeding by placing a small strip of electrical tape on a speed limit sign.

Shortcomings like these are why it's so important for drivers to pay attention. Nearly three years ago, the NTSB called on car companies that implement semi-autonomous systems like Autopilot to create better mechanisms for monitoring drivers while these tools are turned on, in part to alert drivers when they need to take control of the vehicle. Tesla is the only one of those six automakers that hasn't formally responded to the federal agency.

Meanwhile, research from the Insurance Institute for Highway Safety, a nonprofit that's supported by car insurance companies, found that drivers can misunderstand the autonomous capabilities of their vehicles, including Tesla's Autopilot.

And Tesla is known for overstating its vehicles' abilities. On and off in recent years, the company has described its cars as having full self-driving capabilities or has advertised that the vehicles have full self-driving hardware, despite the need for drivers to stay engaged while on the road. Whenever criticism of this sort of marketing language reaches a breaking point, however, Tesla has removed it. The Tesla website currently paints a confusing picture of its cars' capabilities.

All that marketing copy aside, a Tesla using the Autopilot feature is nowhere near a fully autonomous car. The issues that have cropped up around Autopilot have raised concerns about the new safety issues that self-driving vehicles could introduce. More importantly, these issues have bolstered demands for regulators to test this technology more stringently and hold carmakers accountable when they build dangerous tech.

Whether or not that will actually happen is unclear. The Trump administration has, in fact, encouraged federal agencies not to needlessly hamper innovation in artificial intelligence-based technology, and earlier this year at the Consumer Electronics Show (CES) in Las Vegas, Transportation Secretary Elaine Chao announced new rules meant to standardize and propel the development of self-driving cars. Those rules won't do much good if the companies leading the charge toward this futuristic technology, like Tesla, refuse to follow or even acknowledge them.

So it's time for Tesla to do something different. At the very least, the company could answer government regulators' calls to develop better ways to monitor drivers as it continues to improve its self-driving technology. Autopilot clearly doesn't live up to its name quite yet, so either the company fixes it or it risks endangering the lives of its drivers.

For now, please don't text and drive. It's dangerous. And if you own a Tesla, definitely don't text and drive or play a mobile game when you're using Autopilot. That's potentially even more dangerous, since you might feel a false sense of security. Overestimating the abilities of technology like Autopilot puts your life and the lives of others at risk.

