Elon Musk And His Fans Are Losing Their Minds Over CNN’s Autopilot Criticism – Jalopnik

Posted: November 19, 2021 at 6:20 pm

Yesterday, CNN published a video of a New York City test of Tesla's so-called "Full Self-Driving Beta" Level 2 driver-assist software, and it didn't go all that great. The tester was purple-sweater enthusiast and ex-Jalop Mike Ballaban, who did admit to being a little skittish about using the semi-automated driving system in the car. The Tesla enthusiast blog Teslarati wrote about the test and laid the blame for the system's less-than-stellar performance squarely at the feet of whom they call an "inexperienced operator." This take is, to be charitable, absolutely ridiculous.

Just so we're all on the same page, here's the CNN video:

Ballaban's drive was happily free of any wrecks, the direct result of Ballaban taking control away from FSD Beta when he saw it was making poor decisions. Which is precisely what the role of the person in the driver's seat should be when dealing with any Level 2 semi-automated system: be alert and ready to take control.

The Teslarati article notes this lack of wrecks, too, but comes to a very different conclusion:

Fortunately, the CNN respondent did not encounter any accidents during his FSD Beta drive. Things could easily go south, after all, when a driver who is a little skittish uses a system that he is completely unfamiliar with, and one that requires operators to keep their composure and control at all times.

Let's be very clear about what's being described here: when the Teslarati writer talks about things that "could easily go south," he's referring to things going south because of poor driving decisions made by the FSD Beta software. The need for a driver to "keep their composure and control at all times" is because, as Tesla themselves have clearly stated, the system "may do the wrong thing, at the worst time."

The article effectively states the same thing, saying:

This is because, in the hands of an operator that is unfamiliar with Tesla's driver-assist techologies [sic], FSD Beta could be a lot to handle.

Why could it be a lot to handle? Because it doesn't always work so well.

So my question here is, how is the way the system performed here somehow Ballaban's fault? His admitted skittishness didn't prevent him from taking control in the situations where FSD Beta wanted to, say, run into a UPS truck, as seen in the video. His skittishness wasn't a reaction to anything about that car other than FSD Beta and how it drove.

He could have driven that Model 3 around the city just fine without any skittishness at all. Well, no more than the baseline amount of lovable skittishness that Mike Ballaban lives with on a daily basis.

The Teslarati article has a lot of issues with the tone of the video, more than anything else:

This is because, in the hands of an operator that is unfamiliar with Tesla's driver-assist techologies, FSD Beta could be a lot to handle. Unfortunately, this became extremely evident in a recent video from CNN, which featured one of the news agency's respondents using FSD Beta in inner-city streets. The footage, which has been shared on social media, is quite painful if not a bit frightening to watch.

With these factors in mind, the CNN driver did not disappoint on the skittish part, as he seemingly panicked at several points during his drive...It gets chaotic and dramatic, and completely opposite of what FSD Beta videos are like from actual experienced Tesla owners.

I think here we have the crux of what the Teslarati writer is displeased with: the driver of a car that was making poor driving decisions on its own expressed discomfort and displeasure that the car was attempting to do stupid things.

The writer also states that what we saw in this CNN video was "completely opposite of what FSD Beta videos are like from actual experienced Tesla owners," and he's correct, because Ballaban didn't have any perverse need to try and make it seem like a car attempting to steer you into oncoming traffic was fine. Just fine!

Again, let's be absolutely clear here: Ballaban did exactly what an L2 driver is supposed to do: take over when they feel the car is not making optimal decisions. The only thing Ballaban did wrong, at least according to Teslarati, is not pretend that he was somehow cool with what FSD Beta was doing. He complained.

Complaining is something Mike is very good at, and if the goal of the video was to show what a normal, non-Tesla-owning person might think of FSD Beta (there are a good number of such people on Earth, numbering in the billions), then I think he did a very effective job.

What's incredible about this article is that it's taking a non-ideal example of an FSD Beta drive and putting the blame for how poorly FSD Beta came out looking not on the software itself, but on the human in the seat who did not make the poor driving decisions shown.

This would be like if I loaned my Yugo out to someone, knowing full well that the car is full of flaws, quirks, and some downright questionable issues, and then got mad at them when the brakes failed and they slammed into something.

I knew the brakes were kinda shitty when I loaned them the car, and I would have told them that. If they were skittish while driving because the brakes sucked, that's not their fault; it's the fault of my shitty Yugo's brakes.

It's the same thing with FSD Beta: if the system is making driving decisions that make a driver skittish, then it's the fault of the software, not the driver.

Now, even though FSD Beta is being deployed onto public roads, surrounded by pedestrians and cyclists and other motorists and squirrels that have not agreed to participate in a test of semi-self-driving software on a 4,000-pound car, Tesla is more careful about who they allow inside the car that's being beta tested.

For this, they have concocted a "Safety Score" system, which effectively gamifies smooth, if not necessarily always the safest, driving to decide who can have access to FSD Beta. The Teslarati article notes this:

There is a reason why Tesla is incredibly serious about the strict safety measures it is imposing on the FSD Beta program. Drivers who have access to the FSD Beta software, at least for now, are only those that have garnered the highest Safety Scores, at least outside the initial batch of testers that used the advanced driver-assist system in its first year.

The problem is that the Safety Score isn't really a training program that makes people qualified to monitor an unfinished, car-controlling AI. Getting a 100 on your Safety Score just means you were able to meet the very specific criteria mandated by the Safety Score algorithm, even if that meant doing a lot of extra driving in unnatural ways to get there. It's got nothing to do with the very specific task of monitoring an AI driver.

Other companies testing automated driving systems treat their human safety driver qualifications very differently. Cruise, for example, has a month-long training program for its safety drivers. Argo AI's Test Specialists go through multi-week training courses as well, with fewer than four percent of applicants getting certified. And, significantly, nearly all other companies developing automated vehicles pay their safety drivers.

So, Teslarati, don't give me this "unqualified driver" bullshit. The only thing the Safety Score really does is weed out those people not sufficiently thrilled to be an unpaid beta tester for a for-profit company.

Remember, no wrecks happened in that CNN video. The outcome was the same as in so many other Tesla-owner-shot FSD Beta videos: the car drove around on its own a bit, it had some moments that required disengagement, maybe some surrounding drivers got a bit confused or pissed off, and everyone got home fine.

The only real difference is the CNN video featured a driver who was visibly nervous and displeased with some of the car's choices.

This video wasn't a hit job. It was a real, honest portrayal of the current state of FSD Beta. And, if we're talking about drama here, then we really can't leave out this part of the Teslarati article:

There are already a lot of entities waiting for FSD Beta to fail. Tesla critics and skeptics, several of whom are financially incentivized to bring the company down, are practically salivating at the thought of FSD Beta being banned by regulators over potential accidents.

Jeezis, come on. Tesla owners who are beta testing FSD Beta aren't victims of anything. No one is out to get you. Hell, I know for a fact that Mike Ballaban has, at worst, mixed feelings about Tesla, because he said so, right here:

...it's okay to have mixed feelings about Tesla: to fall somewhere between actively rooting for its ugly demise and pledging your spare organs to God Emperor Elon so he can live forever on the Mars Colony. Because both are silly.


It is possible for all of these things to be true, simultaneously. It's possible that we see all of these things, together, and yet we don't want Tesla to die. We're huge fans of consumer choice, innovation and the inevitable electric future of cars, and for there to be more of those things, Tesla needs to succeed. Cars, and the car industry, are all simply more interesting because Elon Musk and Tesla are here. We even want Tesla to succeed because at the end of the day, saving the planet through fewer emissions is not only a good thing, it's a matter of life and death itself.

He's not hoping for Tesla to fail. I'm not hoping for Tesla to fail. I don't have any financial stake in it either way; hell, my investment portfolio consists of two 20-pound bags of ice I bought last weekend and the vague hope that my kidneys will retain their value.

For the sake of car culture in general, the absurd victim mentality of Tesla supporters really needs to cool down. I mean, look at some of these comments on the Teslarati story:

Okay, here, a few things: this has nothing to do with EVs, it's about partial automation, and the driver didn't mess up repeatedly; the software did. The driver took over every time.

Then we get to the more exciting conspiracy takes:

So... why would CNN want to let China take over the EV auto industry? I don't see the angle there.

Now, some commenters were a bit more open in their assessment:

That seems reasonable! Now, here are the responses this comment gathered:

Okay, great. Thanks for that. And:

You get the idea.

Really, this reaction to the CNN video is highly revealing, even fascinating, because I can't recall a united front in automotive culture quite like what we see in Tesla fandom.

We have a review of a feature of a car, and a response to that review that essentially focuses on how the reviewer was made uncomfortable by the car's feature, instead of focusing on the performance of that feature itself.

This is a strange and unhealthy trend, and it's showing no signs of dissipating any time soon.

Oh boy.

UPDATES: Right as this was published, we noticed Elon Musk himself was responding to the CNN video, via Twitter:

Here, based on the selected emojis, the concept that Ballaban was "acting" has brought his yellow, angled head to tears via laughter, and he's displaying an upraised thumb in a gesture of approval.

Elon also decided to just do some wild-ass speculation:

I mean, since the CNN story was largely a video, I'm not exactly sure how the story could have been written before the drive took place? Or does he mean the Teslarati story? It's confusing.

If this gets any more stupid, we'll update.
