New MIT study confirms Tesla’s autopilot is indeed unsafe – Screen Shot

Posted: September 24, 2021 at 11:41 am

A month ago, towards the end of August 2021, the National Highway Traffic Safety Administration (NHTSA) launched an investigation into Tesla's Autopilot system after it was linked to 11 accidents, resulting in 17 injuries and one death. Now a new study, conducted by the Massachusetts Institute of Technology (MIT), has confirmed how unsafe Elon Musk's infamous autopilot feature actually is.

Titled 'A model for naturalistic glance behavior around Tesla Autopilot disengagements', the study backs up the idea that the electric vehicle company's Full Self-Driving (FSD) system is in fact (surprise, surprise) not as safe as it claims. After following Tesla Model S and X owners during their daily routine for periods of a year or more throughout the greater Boston area, MIT researchers found that, more often than not, they become inattentive when using partially automated driving systems. Note here that I went from calling the autopilot a Full Self-Driving system (which is the term Tesla uses to describe it and therefore implies it is fully autonomous) to then calling it an automated driving system, also known as an advanced driver assist system (ADAS), which is what it truly is.

"Visual behavior patterns change before and after [Autopilot] disengagement," the study reads. "Before disengagement, drivers looked less on road and focused more on non-driving related areas compared to after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving were not compensated by longer glances ahead." To be completely fair, it does make sense that drivers would feel less inclined to be attentive when they think their car's autopilot is fully in control. Only thing is, it isn't.

Meanwhile, by the end of this week, Tesla will roll out the newest version of its autopilot beta software, version 10.0.1 in this case, on public roads, completely ignoring the current federal investigation into the safety of its system. Billionaire tings, go figure.

Musk has also clarified that not everyone who has paid for the FSD software will be able to access the beta version, which promises more automated driving functions. First things first, Tesla will use telemetry data to capture personal driving metrics over a 7-day period in order to ensure drivers are still remaining attentive enough. "The data might also be used to implement a new safety rating page that tracks the owner's vehicle, which is linked to their insurance," added TechCrunch.

In other words, Musk is aware of the risk the current autopilot system represents, and he's working hard on improving it, or at least making sure he's not going to be the one to blame if more Tesla-related accidents happen. How do you say your autopilot is not an autopilot without clearly saying it, and therefore risking hurting your brand? You release a newer version of it that can easily blame drivers for their carelessness, duh.

The researchers found this type of behavior "may be the result of misunderstanding what the [autopilot] feature can do and what its limitations are, which is reinforced when it performs well. Drivers whose tasks are automated for them may naturally become bored after attempting to sustain visual and physical alertness, which researchers say only creates further inattentiveness," continued TechCrunch.

My opinion on Musk and Tesla aside, the point of the MIT study is not to shame Tesla, but rather to advocate for driver attention management systems that can give drivers feedback in real-time or adapt automation functionality to suit a driver's level of attention. Currently, Tesla's autopilot system doesn't monitor driver attention via eye or head-tracking, two things that researchers deem necessary.

The technology in question (a model for glance behaviour) already exists, with automobile manufacturers like Mercedes-Benz and Ford allegedly already working on implementing it. Will Tesla follow suit or will Musk's only child energy rub off on the company?
