Deepfakes and Fake News Pose a Growing Threat to Democracy – Northeastern University

Posted: April 2, 2022 at 5:57 am

This report is part of ongoing coverage of the Russia-Ukraine war. Visit our dedicated page for more on this topic.

In mid-March, as the Russian invasion of Ukraine crept into its third week, an unusual video started making the rounds on social media and was even broadcast on the television channel Ukraine 24 due to the efforts of hackers.

The video appeared to show Ukrainian President Volodymyr Zelenskyy, looking stilted, his head moving while his body remained largely motionless, calling on the citizens of his country to stop fighting Russian soldiers and to surrender their weapons. He had already fled Kyiv, the video claimed.

Except, those weren't the words of the real Zelenskyy. The video was a deepfake, or content constructed using artificial intelligence. In a deepfake, individuals train computers to mimic real people to make what appears to be an authentic video. Shortly after the deepfake was broadcast, it was debunked by Zelenskyy himself, removed from prominent online sources like Facebook and YouTube, and ridiculed by Ukrainians for its poor quality, according to the Atlantic Council.
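For readers curious about the machinery behind that mimicry, the design most commonly associated with face-swap deepfakes pairs one shared encoder with a separate decoder per identity; swapping decoders at inference renders one person's expressions with another person's face. What follows is a minimal, illustrative PyTorch sketch of that idea only. The layer sizes, toy data, and single training step are assumptions made for illustration, and nothing here describes the specific tooling behind the Zelenskyy video.

# Minimal sketch (assumption): the classic face-swap setup trains one shared
# encoder with a separate decoder per identity; swapping decoders at inference
# maps person A's expressions onto person B's face. Sizes are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a = Decoder()  # learns to reconstruct person A's face
decoder_b = Decoder()  # learns to reconstruct person B's face

# Training sketch: each decoder reconstructs its own person's cropped, aligned
# faces through the shared encoder. Random tensors stand in for real face crops.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)
loss_fn = nn.MSELoss()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
for _ in range(1):  # one illustrative step; real training runs far longer
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()

# The "swap": encode person A's expression, decode it with person B's decoder.
with torch.no_grad():
    fake_frame = decoder_b(encoder(faces_a[:1]))

The point of the sketch is the structure, not the output: because the encoder is shared, it learns pose and expression common to both people, while each decoder learns one person's appearance, which is why the swapped output keeps one person's motion but the other's face.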

However, just because the video was quickly discredited doesn't mean it didn't cause harm. In a world increasingly politically polarized, in which consumers of media may believe information that reinforces their biases regardless of the content's apparent legitimacy, deepfakes pose a significant threat, warns Northeastern University computer science and philosophy professor Don Fallis.

Left to right: Don Fallis, Northeastern University computer science and philosophy professor; and David Lazer, distinguished professor of political science and computer and information science. Courtesy photo and Photo by Adam Glanzman/Northeastern University

"It's sort of interesting the respect in which it wasn't a particularly high-quality deepfake. There were all sorts of indicators that the individual consumer of information might think, 'This doesn't look right,'" Fallis says about the deepfake of Zelenskyy. "That being said, with all of these sources of misinformation, no matter how credible the information looks, if you have a strong leaning toward a particular viewpoint, if you receive information confirming that pre-existing bias, the source of that information, and the plausibility of that information, may not matter."

In his research, Fallis, who studies epistemology, or the theory of knowledge, tries to put modern issues like deepfakes and fake news into the larger philosophical context of how individuals acquire and digest true knowledge, as well as misinformation.

In 2018, he co-authored an article titled "Fake News is Counterfeit News" with Northeastern philosophy professor Kay Mathiesen. The article examined the threat that fake news poses to democracy and knowledge and sought to define the concept. Two years later, he wrote an article about deepfakes, "The Epistemic Threat of Deepfakes," in which he concluded that deepfakes can lead to false beliefs, undermine the justification for true beliefs, and prevent people from acquiring true beliefs.

Fallis argues both fake news and deepfakes have the negative effect of delegitimizing real news. He says they decrease the amount of true information available, reduce consumers' trust in authentic media, and put an added burden on fact-checkers to authenticate the vast amount of content online.

"In the case of fake news, you're creating this online presence that's supposed to look like a legitimate news site," Fallis says. "Similarly, in the case of deepfakes, you're creating video and audio that are also supposed to look like legit media."

Additionally, in combination with tools used to collect individual users' personal information en masse, deepfakes can be used maliciously to target large audiences and manipulate them by playing on their ingrained biases, Fallis says.

"It may not just be this one killer technology," he says. "It's not like deepfakes are going to be the one thing that takes us over the cliff. It's a whole suite of potentially problematic technology."

Increased political division has a similar impact on the way people interpret fake news, with users clearly seeking out and accepting information that's compatible with their prior biases, notes Northeastern political science and computer sciences professor David Lazer. However, it's unclear just how much people relax their critical-thinking skills when encountering media that reinforces their worldview.

"Certainly, we've seen an increased polarization in the public's opinions, and that's clearly one of the factors that may be at play with the spread of misinformation," Lazer says. "It's quite plausible that the political polarization and spread of misinformation are going hand in hand, but that's an area of needed research."

Lazer directs Northeastern's Lazer Lab, which conducts research on social influence and networks, and his studies focus primarily on the proliferation of misinformation on social media. In 2019, he co-authored a study on the prevalence of fake news on Twitter during the 2016 presidential election cycle.

Deepfake technology is also quite relevant to his studies, Lazer says, but there needs to be more research on the different types of misinformation, how they spread, and their psychological impact on consumers of media. The rise in political polarization and its impact on the consumption of media is also a high-priority area of study, he adds.

"We can certainly say over the last 40 years there has been increased polarization of many kinds, and that's concerning," Lazer says.

Beyond the issue of users failing to question the deepfakes they come across if the content confirms their existing worldview, the technology poses other significant concerns.

One of the most problematic uses of the technology is when an individual's likeness, typically a woman's, is manipulated and superimposed onto a sexually explicit video, making it appear as if the person being targeted is participating in the sexual activity, says Marc Berkman, the executive director of the Organization for Social Media Safety, a nonprofit dedicated to making social media safe through advocacy and education.

Additionally, as in the case of the deepfake of Zelenskyy, the world is witnessing the technologys political impact, Berkman says. Deepfakes can potentially interfere with democratic elections and be used as propaganda to sow division and doubt, he says.

Fallis and Berkman emphasize the importance of users cultivating critical-thinking skills when venturing online. One way for people to protect themselves against deepfakes is to engage in safe social-media use: Approach content, particularly news, with a critical eye.

The Organization for Social Media Safety is currently supporting media training in public schools, helping children understand news sources so they can take a non-partisan approach to evaluating and understanding the credibility of content.

"It's incredibly important for our democracy to understand what is real and what is not," Berkman says. Limiting time on social media to healthy amounts is also important, so people can avoid deepfakes used for propaganda purposes.

However, Fallis and Berkman note, individual efforts can't replace structural change in businesses and governments aimed at combating the proliferation of this potentially dangerous technology.

Social-media giants, like Facebook, have adopted policies vowing to remove deepfakes from their platforms if they meet certain criteria, and some state governments, like California's, have adopted laws imposing civil liability on creators of intentionally harmful deepfakes.

In California, Berkman says, his organization is working to pass a state law that would also impose criminal penalties on the creators of malicious pornographic deepfakes, with the hope that such laws will spread to other states and that the federal government will adopt similar legislation.

For media inquiries, please contact media@northeastern.edu.
