The little things (pop-ups, notifications, warnings) work to fight fake news, new evidence shows – Nieman Journalism Lab at Harvard

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

In 2017, Facebook released a set of "Tips to spot false news." "Developed in collaboration with First Draft, the tips were promoted at the top of users' news feeds in 14 countries in April 2017 and printed in full-page newspaper advertisements in the United States, the United Kingdom, France, Germany, Mexico, and India," write the authors of a study published this week in PNAS. "A variant of these tips was later distributed by WhatsApp (a Facebook subsidiary) in advertisements published in Indian and Pakistani newspapers in 2018. These tips are therefore almost surely the most widely disseminated digital media literacy intervention conducted to date."

The researchers tested the effectiveness of these tips on audiences in the U.S. and India and found that they worked.

Strikingly, our results indicate that exposure to variants of the Facebook media literacy intervention reduces people's belief in false headlines. These effects are not only an artifact of greater skepticism toward all information: although the perceived accuracy of mainstream news headlines slightly decreased, exposure to the intervention widened the gap in perceived accuracy between mainstream and false news headlines overall. In the United States, the effects of the treatment were particularly strong and remained statistically measurable after a delay of approximately 3 weeks. These findings suggest that efforts to promote digital media literacy can improve people's ability to distinguish between false and mainstream news content, a result with important implications for both scientific research into why people believe misinformation online and policies designed to address the problem.

"A brief intervention, which could be inexpensively disseminated at scale, can be effective at reducing the perceived accuracy of false news stories," the authors conclude, "helping users more accurately gauge the credibility of news content they encounter on different topics or issues."

Consumer Reports' Kaveh Waddell (he's an investigative reporter at the Consumer Reports Digital Lab, which launched last year and which I'm looking forward to reading more from) points out that Facebook itself could surely shed further light on these research findings: The company should know how many people clicked on the media literacy list, how long they spent on that page, whether they later changed their reading or sharing habits, and how long any effects lasted. But it's not sharing. "These scholars did an amazing job of looking at the scale of the intervention with the tools they had available, but I'm just so disappointed that there isn't a way for an independent audit of what happened on the platform," First Draft's Claire Wardle told Waddell.

On the topic of brief interventions, Facebook is taking a cue from The Guardian and will show a warning if users try to share a story that's more than 90 days old. (If they still want to share it after that, they can.) Other types of notifications may be coming, too. From Facebook's John Hegeman, VP of feed and stories:

Over the past several months, our internal research found that the timeliness of an article is an important piece of context that helps people decide what to read, trust and share. News publishers in particular have expressed concerns about older stories being shared on social media as current news, which can misconstrue the state of current events. Some news publishers have already taken steps to address this on their own websites by prominently labeling older articles to prevent outdated news from being used in misleading ways.

Over the next few months, we will also test other uses of notification screens. For posts with links mentioning COVID-19, we are exploring using a similar notification screen that provides information about the source of the link and directs people to the COVID-19 Information Center for authoritative health information. Through providing more context, our goal is to make it easier for people to identify content that's timely, reliable and most valuable to them.
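The age check behind the warning screen is simple in principle. Here's a minimal sketch of the 90-day rule described above; the function and threshold names are hypothetical, not Facebook's actual implementation, and the real system would also let the user proceed after seeing the warning.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical threshold matching the 90-day rule described in the post
STALENESS_THRESHOLD = timedelta(days=90)

def needs_staleness_warning(published_at, now=None):
    """Return True if an article is old enough to trigger a warning screen
    before sharing. Both datetimes should be timezone-aware."""
    if now is None:
        now = datetime.now(timezone.utc)
    return now - published_at > STALENESS_THRESHOLD
```

The warning is advisory, not blocking: a share flow would show the notification screen when this returns True, then proceed if the user confirms.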

(OK, now do it for Trump's posts.)

The Shorenstein Center has a report on COVID-19 misinformation in Black online communities in the United States, an especially crucial topic since Black people are disproportionately affected by the coronavirus, dying of it at a higher rate than White people. Brandi Collins-Dexter identified four main strands of misinformation circulating: some organic, some targeted directly at the community by outsiders.

1. Black people could not die from COVID-19
2. The virus was man-made for the purposes of population control
3. The virus could be contained through use of herbal remedies
4. 5G radiation was the root cause of COVID-19

"Our research makes clear that the health misinformation surrounding COVID-19 poses an immediate threat to the health of Black people, and is a symptom of an information ecosystem poisoned by racial inequality," Collins-Dexter writes.

While there is much to be learned about COVID-19 and how it works, it is clear that misinformation and conspiratorial frames that suggest that Black people are somehow inoculated from the disease are both dangerous and patently untrue. Black lives are consistently put in danger, and it is incumbent upon community actors, media, government, and tech companies alike to do their part to ensure that timely, local, relevant, and redundant public health messages are served to all communities.

The Washington Post's Christopher Ingraham has a very useful, detailed roundup of three recent studies focused on conservative media's role in fostering confusion about the seriousness of the coronavirus. Taken together, they "paint a picture of a media ecosystem that amplifies misinformation, entertains conspiracy theories and discourages audiences from taking concrete steps to protect themselves and others."
