The Misinformation Age Has Exacerbated, and Been Exacerbated by, the Coronavirus Pandemic – TIME

If you're looking for solid information on COVID-19, the Internet is not always your best bet: it is equal parts encyclopedia and junkyard, solid science on the one hand and rubbish, rumors and fabulism on the other. Distinguishing between the two is not always easy, and with so much of the time we spend online devoted either to sharing links or reading ones that have been shared with us, not only does the junk get believed, it also gets widely disseminated, creating a ripple effect of falsehoods that can misinform people and even endanger lives.

At its worst, misinformation of this sort may cause people "to turn to ineffective (and potentially harmful) remedies," write the authors of a new paper in Psychological Science, "as well as to overreact (hoarding goods) or, more dangerously, to underreact (engaging in risky behavior and inadvertently spreading the virus)."

It's well-nigh impossible to keep the Internet entirely free of such trash, but in theory it ought not be quite as hard to confine it to the fever swamps where it originates and prevent it from spreading. The new study explores not only why people believe Internet falsehoods, but how to help them become more discerning and less reckless about what they share.

One of the leading reasons misinformation about the COVID-19 pandemic gains traction is that it's a topic that scares the daylights out of us. The more emotional valence something we read online has, the likelier we are to pass it on, either to share the joy if it's something good or to unburden ourselves if it's bad.

"Our research has shown that emotion makes people less discerning," says David Rand, associate professor at the MIT Sloan School of Management and a co-author of the new study. "When it comes to COVID-19, people who are closer to the epicenter of the disease are likelier to share information online, whether it's true or false."

That's in keeping with earlier research out of MIT, published in 2018, showing that fake news spreads faster on Twitter than does the truth. The reason, the researchers in that study wrote, was that the lies were "more novel than true news," "[eliciting] fear, disgust and surprise in replies," just the things that provide the zing to sharing in the first place.

Political leanings also influence what's shared and not shared. A 2019 Science study, from researchers at Northeastern, Harvard, and SUNY-Buffalo, showed that neither the left nor the right has a monopoly on sharing fake news or real news, with both ends more or less equally mixing fact and fiction. Which facts and which fictions they chose, however, were typically consistent with which stories fit more comfortably with their own ideologies.

To dig deeper still into the cognitive processes behind sharing decisions, Rand and colleagues developed a two-part study. In the first, they assembled a sample group of 853 adults and asked them to take a pair of tests. One, known as the Cognitive Reflection Test (CRT), measures basic reasoning processes, often with questions that are slipperier than they seem. (For example: "If you are running a race and you pass the person in second place, what place are you in?" The seemingly obvious answer, first place, is wrong. You've simply replaced the second-place runner, but the person in first is still ahead of you.)

The other test was more straightforward, measuring basic science knowledge with true-or-false statements such as "Antibiotics kill viruses as well as bacteria" (false) and "Lasers work by focusing sound waves" (also false).

Finally, the entire sample pool was divided in half. Both groups were shown the same series of 30 headlines about COVID-19, 15 false and 15 true, but they were instructed to do two different things with them. One group was asked to determine the accuracy or inaccuracy of the headlines. The other group was asked if they would be inclined to share the headlines online.

The results were striking. The first group correctly identified the truth or falsehood of about two thirds of the headlines. The second group, freed from having to consider the accuracy of what they were reading, reported that they would share about half of the headlines, equally divided between true ones and false ones. If they were taking the time to evaluate the headlines' veracity, they would be expected to share at something closer to the rate of the first group: about two thirds true and one third false. "When people don't reflect, they make a rapid choice and they share without thinking. This is true for most of us," says Gordon Pennycook, assistant professor at the University of Regina School of Business in Saskatchewan, and lead author of the study.

Most, but not all. The study did find that people who scored higher on the CRT and basic science tests were a little less indiscriminate, tending to do a better job both at identifying false stories and at making sound sharing decisions.

The solution, clearly, is not to force everyone to pass a reasoning test before they're admitted online. Things are actually a lot easier than that, as the second part of the study showed.

For that portion, a different sample group of 856 adults was once again divided in two and once again shown the same set of headlines. This time, however, neither group was asked to determine the accuracy of the headlines; instead, both were asked only if they would share them. But there was still a difference between the two groups: one was first shown one of four non-COVID-19-related headlines and asked to determine whether it was true or false.

That priming, asking the participants to engage their critical faculties before beginning the sharing task, seemed to make a dramatic difference: the primed group was three times less likely to share a false headline than the unprimed group.

"Nudges like this help a lot," Rand says. "If you get people to stop and think, they do a better job of evaluating what they're reading."

The researchers believe there are easy, real-world applications that platforms like Facebook and Twitter could use to provide people the same kind of occasional cognitive poke they did in their study. "One idea we like is to crowd-source fact-checking out to users," Pennycook says. "Ask people if [some] headlines are accurate or not; the platforms themselves could learn a lot from this too."

Rand cautions against anything that could seem patronizing to readers, leaving them feeling like they're being quizzed by some social media giant. Instead, he recommends a little bit of humility.

"You could stick little pop-ups into newsfeeds that say, 'Help us improve our algorithms. Are these stories accurate?'" he recommends.

In no event is the Internet going to be scrubbed of all rubbish. For plenty of hucksters, politicos and conspiracy-mongers, the Internet's hospitality to inaccuracies is a feature, not a bug, and there is little way to purge them entirely. But small interventions can clearly make a difference. And when it comes to information about the pandemic, on which life-and-death decisions may be made, the stakes for trying could not be higher.

This appears in the August 03, 2020 issue of TIME.


Write to Jeffrey Kluger at jeffrey.kluger@time.com.
