Facebook spread rumors about arsonists setting fires in Oregon. It’s part of their business model.

As wildfires were burning across Oregon and California this week, conspiracy theories about how the fires started were moving nearly as rapidly on Facebook. Posts falsely blaming members of antifa or Black Lives Matter spread across the platform nearly unchecked, causing calls about antifa arsonists to clog emergency phone lines. Local and national law enforcement had to spend precious time and resources rebutting the false claims, instead of rescuing residents and aiding in evacuations.

Facebook said last Saturday that it was banning fire-related conspiracy talk from the platform. But, according to research by the German Marshall Fund of the United States, the misinformation continued to circulate for days afterward, eluding whatever mechanisms Facebook had put in place to end it.

Facebook had time to prepare for such a contingency; this is certainly not the first time the company has been called upon to quell conspiracy-mongering around a major national event and failed to do so. Following the killing of George Floyd in May and the ensuing protests, for example, Facebook posts falsely alleging that Floyd's death had been faked, or that the entire protest movement was organized by the CIA, spread across the platform. Facebook pledged to crack down on the spread of vile nonsense, but its efforts, never made fully transparent, were similarly ineffective.

Why does Facebook find itself, over and over, unable to cope with the exploitation of its platform to spread conspiracies, misinformation and propaganda? Because sensationalized content is how Facebook makes money. So, until its business model changes, the problems it enables won't stop.

Facebook turns a profit by surveilling its users and monetizing their attention: It uses its huge user base and its dominance of social networking, both through its main site and through other platforms it owns, such as Instagram, to collect a massive amount of data, which is sold to advertisers. In other words, Facebook and Instagram are free services to you only because you are the product Facebook and Instagram sell to other entities.

The longer you stay on Facebook, the more ads Facebook shows you, and the more money it pockets from various advertisers for having shown you those ads. The way to get you to stay on Facebook longer, scrolling through your feed, is to hook you on addictive content, which enables Facebook to collect ever more data that can be sold to third parties, a vicious cycle that entrenches its dominance.

It turns out that, far more than jealousy-inspiring vacation photos or family updates from long-lost high school friends, the most addictive content for users is sensationalized versions of the news, some of which involve conspiracy theories and memes. So rather than having an incentive to eliminate conspiracy-laden content, Facebook is incentivized to keep serving it to you in the interest of profiting from your attention.

It is then no coincidence that people who wind up joining extremist groups on Facebook do so because Facebook's algorithms suggest it, according to Facebook's own internal investigation. Facebook's programming knows such people are likely to get hooked by that sort of organization, and thus to use Facebook more than ever, and so it helps reel them in.

And so, as with many of the problems Facebook causes, potential solutions run up against its profit motive, with the latter winning out.

Expecting Facebook to then solve a problem that is an inherent part of its business model is akin to expecting poachers to implement measures to protect rare wildlife: Doing so would put them out of business.

Adding insult to injury, the local news sources that might have been able to spring into action and rebut some of what's peddled on Facebook have been decimated by Facebook (and Google), which both charge them to reach their own readers and hoover up the bulk of the available digital ad dollars, leaving the local journalism industry a shell of its former self. Hyper-partisan websites (which are often more willing to buy ads to reach eyeballs) masquerading as local news have taken their place instead and extended their audience via Facebook, another feedback loop making it ever more difficult for people to access good local information.

While this problem is most acute during crises, it plays out all the time in ways large and small. In Holyoke, Massachusetts, for instance, the decline of the local journalism industry led to the loss of several papers. So, when a ballot initiative was put forward in 2019 that would have approved a bond issue for new middle schools, information about it was spread by commenters in a Facebook group instead of by local reporters; locals received misinformation about the city's finances instead of the truth.

As one city employee told me, officials wound up playing whack-a-mole with every little conspiracy theory out there, instead of spending their time thinking about fixing potholes or opening health clinics.

The response to this repeated, ongoing problem has generally been for politicians, regulators and enforcement agencies to politely request that Facebook change its ways. But that will never be good enough because, until the way in which Facebook makes its money changes, Facebook will not change.

There has been some encouraging movement on that front recently, though: The Wall Street Journal reported this week that the Federal Trade Commission is considering opening an antitrust case against Facebook. And, in July, the antitrust subcommittee of the House Judiciary Committee held a major hearing on the market power of big tech platforms, including Facebook, and will soon issue recommendations on how to address that power.

In the end, though, truly making social media safe for democracy means breaking up Facebook's many businesses, which include Facebook, Instagram and WhatsApp, and then changing the way it makes money, both by altering its liability for user content and by curtailing its ability to microtarget advertising to specific users. Until such structural and regulatory changes are made, Facebook conspiracies will continue to circulate, and the world will continue to be a more dangerous place in which to live, work and vote.
