7 Expert Takeaways As the Supreme Court Considers Government Influence on Content Moderation

Posted: March 18, 2024 at 11:33 am

(Editor's note: Listen to a Just Security Podcast episode of the expert panel here and watch the panel discussion on Just Security's YouTube channel here.)

Recently, public debates over the treatment of misinformation and disinformation related to issues such as the COVID-19 pandemic and federal election administration have spilled over into the legal realm. One central question is the degree to which the government can persuade social media companies to alter their content moderation decisions, and when those efforts become so coercive that they violate the First Amendment. The debate over what is often termed "jawboning" will come before the Supreme Court, which will hear arguments in Murthy v. Missouri on March 18.

In the case, a group of social media users, along with Louisiana and Missouri, sued the Biden administration in 2022. They alleged that officials across the federal government coerced social media platforms to censor accounts and content that cast doubt on the safety and effectiveness of COVID-19 vaccines. The government has argued that its actions were permissible and did not amount to coercion.

Also at issue in the case is whether the plaintiffs have standing, or the ability to sue in federal court. The plaintiffs argue that there was a causal and temporal link between the government's actions and those of social media companies that affected content posted by individual plaintiffs and state officials. Furthermore, they argue that citizens and states have a First Amendment right to receive information and ideas.

The government argues that (1) the individual plaintiffs have not demonstrated that platform actions were traceable to the government, and that past incidents, rather than an immediate threat of repeated injury, do not establish standing to seek prospective relief; and (2) the states lack standing because they do not possess First Amendment rights (regarding the moderation of content posted by state officials), nor do they possess a right to listen to their citizens on social media.

On July 4, 2023, a federal district court judge issued a broad injunction prohibiting federal government officials from many forms of communication with social media companies. The Fifth Circuit subsequently upheld the injunction but narrowed it to prohibit government actions that coerce or significantly encourage social media platforms to suppress certain content. On Oct. 20, 2023, the Supreme Court stayed the injunction and agreed to hear the case.

Earlier this month, Just Security and the Reiss Center on Law and Security at NYU School of Law co-hosted a panel of experts with experience in government lawyering, private platforms, and free speech advocacy to discuss Murthy and its ramifications for the modern digital public square. Moderated by Professor Ryan Goodman, the panel consisted of Jameel Jaffer, the Executive Director of the Knight First Amendment Institute at Columbia University and Executive Editor of Just Security; Kathryn Ruemmler, the Chief Legal Officer and General Counsel of Goldman Sachs and former White House Counsel to President Barack Obama; and Colin Stretch, the Chief Legal Officer and Corporate Secretary of Etsy and former General Counsel of Facebook (now Meta). The panelists discussed topics including the ramifications of Murthy for content moderation writ large; the roles and interests of the government, social media companies, and social media users in public discourse; definitions of government coercion; and related issues.

Here are seven takeaways from the remarks delivered by the panelists:

According to Jaffer, while this case has a particular partisan valence, with Republican-leaning social media users suing a Democratic administration over content related to COVID-19 and election integrity, the next case may be presented differently.

He posed a hypothetical situation in which the Trump administration attempted, in the summer of 2020, to persuade social media companies to take down speech supportive of the Black Lives Matter movement. What would the reaction have been had the Trump administration made concerted efforts, including private communications and public statements by then-President Donald Trump claiming that social media companies were "killing people" by not taking down what he considered to be incendiary and violent speech, as President Biden said of platforms hosting COVID-19 misinformation in July 2021? Jaffer pointed to other issues, including the Dobbs decision and the Israel-Hamas war, where there have been speech-related controversies: "I worry whether we can cabin the rules [around jawboning] to [just] the public health context … it is especially important that government speech be subject to real checks and counterweights."

While Jaffer reiterated the importance of a principled approach that prevents abuses of power and extends beyond the facts and partisan stakes of the Murthy case, Ruemmler highlighted the unique nature of the COVID-19 pandemic and said that the government was fulfilling its job to protect the health, safety, and welfare of its citizens, and that the only way to get the pandemic under control was to get to the hearts of the citizenry, including through social media. Stretch said that most cases of moderating content with the potential for offline harm lack a political lens and lack strong advocates against the content's suppression, as in the case of content promoting child sexual abuse or terrorism.

While Ruemmler took issue with the specific phrasing of some instances of government communications to social media companies, she argued that many of the comments from the White House were not nearly as threatening as portrayed, such as then-Press Secretary Jen Psaki's reiteration of President Biden's support for antitrust and transparency reforms, as well as potential reforms to Section 230 that would revise its liability shield for social media companies. Ruemmler said, "If you have any appreciation for where real enforcement power lies, then you'd know that White House digital strategists have zero influence over agencies with real regulatory authority," adding that any reform in this space must be drafted and passed by Congress.

Drawing on his experience at Meta (formerly known as Facebook), Stretch argued that because social media companies are making decisions in many areas, such as public health and child safety, where they lack expertise, companies want the ability to communicate with government and civil society experts to inform their content moderation policies. Because the scale of content these companies host is huge and mind-boggling, they would often have outside actors, such as the government and civil society groups, flag content that allegedly violated the platforms' policies. Rather than being overridden or coerced, these companies exercised independent judgment. Likewise, Ruemmler said that the record shows the willingness of social media companies to be engaged in conversation about ways to combat the pandemic.

Jaffer countered that social media companies host so much content that they necessarily do not care very much whether particular content stays up and therefore are incentivized to comply with government requests. Furthermore, he argued that social media companies often follow their competitors and operate in a cartel-like manner, which threatens editorial diversity in the digital public sphere.

Jaffer acknowledged legitimate interests on both sides. On the one hand, there is the interest of the American people in having a government that can govern effectively, including the power to speak: "It can be legitimate to try to persuade private speech intermediaries to be more attentive to what [the government] says is the public interest." It is also legitimate, as Stretch mentioned, for the government to share information and expertise that no one else possesses, for instance, public health data from the Centers for Disease Control and Prevention. On the other hand, Stretch said that social media companies and users have an interest in maintaining expressive spaces that are free from government coercion and reflect autonomous editorial decisions. Both are important interests, and the law can balance them by drawing a distinction between persuasion and coercion.

Jaffer distinguished large and powerful social media companies from smaller, less sophisticated entities, for example, a local LGBTQ bookstore, which has an expressive interest that social media companies do not necessarily have. Even then, he noted the risk of what Daphne Keller refers to as "anticipatory obedience," whereby regulated entities shape their conduct to avoid adverse reactions from the government. While a test for coercion might ask whether there have been changes in content moderation policy in response to purported jawboning, Jaffer is not sure it is possible to determine whether government pressure was dispositive to an editorial decision that may have had multiple motives.

Ruemmler argued that, on the facts of the case, there was no indication that the companies felt coerced, as they are among the most powerful and sophisticated companies in the world and employ multiple former government officials: "These are companies that understand how government and the world works; they are not individual citizens." Stretch agreed, saying that these big companies don't get scared easily, routinely face pressure from governments all over the world, and often feel empowered to push back against government pressure and criticism. Companies understand that heated political rhetoric is part of life in the big leagues.

However, when asked about the possibility of government persuasion becoming routinized and received by less sophisticated middle management, Stretch countered that social media companies have routine communications with many groups, among which the government has no pride of place, even in national security matters. Additionally, in politically charged cases, adversely affected advocates tend to be very vocal, which helps guard against any creep of acquiescence to government requests.

Jaffer drew a distinction between public and private government communications, with the latter posing a greater threat: "If Biden weighs in publicly, others can push back. If the White House privately emails Facebook with a request to take down content accompanied by a threat, there will be no pushback because no one will know this communication exists." He questioned why the government's ability to send private emails to private corporations should be protected, and stated his preference for mandating transparency around these communications.

Stretch agreed, saying that transparency would address many concerns regarding jawboning: "Many of the requests from foreign governments to take down content are really problematic, saying this person is a terrorist when they're actually a political opponent." According to Stretch, there are few benefits to keeping government communications private; instead, it would be healthier to increase transparency.

Ruemmler clarified that the government was sending emails to intermediary platforms, not individual speakers. Under the First Amendment, individual speakers do not have a right to publish on those platforms; rather, they have only the right to publish consistent with platform policies.

Jaffer agreed that users lack a constitutional right to publish on a platform like Facebook, but clarified that they do have the right to publish to the extent that Facebook wants and the right not to have that relationship distorted by the government. "For those who are skeptical as to whether social media companies are sufficient proxies for their users, focusing myopically on intermediaries is not enough. There needs to be a focus on the interests and rights of users," Jaffer said. He pointed to the NetChoice cases, in which laws passed by Texas and Florida purporting to protect the interests of social media users are being challenged before the Supreme Court:

Even justices skeptical of these laws seem sympathetic to the idea that we may need to put in place protections to ensure that a handful of social media companies that have become gatekeepers of the digital public sphere are actually representing the views and interests of their users.

Stretch reiterated that most cases of content moderation lack a partisan valence and lack advocates against suppression, as in cases of terrorism and child safety:

For years, companies prohibited registered sex offenders from having Facebook and YouTube accounts. People who had paid their debt to society were effectively locked out of the digital world despite there never having been a law mandating this exclusion, solely as the result of a particular state Attorney General poking companies. There was no process, and no one argued that this disability that the state Attorney General was trying to force on companies was overbroad. This resulted in every company adopting the policy and keeping many people offline. At the end of the day, who's going to stand up for registered sex offenders? Similarly, with controversial content related to terrorism, who's going to stand up on the side of speech?

However, because of its politically charged nature, Murthy features strong advocates and compelling arguments on both sides. When the Supreme Court hears arguments on March 18, it will likely consider many of these issues.
