Within days of the January 6 Capitol insurrection, outgoing President Donald Trump's internet presence was in upheaval. Trump's social media accounts were suspended across Facebook, Twitter, YouTube, Instagram, Snapchat, Twitch, and TikTok.
The same was true for many of Trump's more extremist followers. Twitter suspended more than 70,000 accounts primarily dedicated to spreading the false right-wing conspiracy theory QAnon. Apple, Google, and Amazon Web Services banned the right-wing Twitter alternative Parler, effectively shutting down the site indefinitely (though it's attempting to return) and relegating many right-wingers to the hinterlands of the internet.
Permanently revoking users' access to social media platforms and other websites, a practice known as "deplatforming," isn't a new concept; conservatives have been railing against it and other forms of social media censure for years. But Trump's high-profile deplatforming has spawned new confusion, controversy, and debate.
Many conservatives have cried censorship, believing they've been targeted by a collaborative, collective agreement among leaders in the tech industry in defiance of their free speech rights. On January 13, in a long thread about the site's decision to ban Trump, Twitter CEO Jack Dorsey rejected that idea. "I do not believe this [collective deplatforming] was coordinated," he said. "More likely: companies came to their own conclusions or were emboldened by the actions of others."
Still, the implications for free speech have worried conservatives and liberals alike. Many have expressed wariness about the power social media companies have to simply oust whoever they deem dangerous, while critics have pointed out the hypocrisy of social media platforms spending years bending over backward to justify not banning Trump despite his posts violating their content guidelines, only to make an about-face during his final weeks in office. Some critics, including Trump himself, have even floated the misleading idea that social media companies might be brought to heel if lawmakers were to alter a fundamental internet law called Section 230, a move that would instead curtail everyone's internet free speech.
All of these complicated, chaotic arguments have clouded a relatively simple fact: Deplatforming is effective at rousting extremists from mainstream internet spaces. It's not a violation of the First Amendment. But thanks to Trump and many of his supporters, it has inevitably become a permanent part of the discourse involving free speech and social media moderation, and the responsibilities that platforms can and should have to control what people do on their sites.
We know deplatforming works to combat online extremism because researchers have studied what happens when extremist communities get routed from their homes on the internet.
Radical extremists across the political spectrum use social media to spread their messaging, so deplatforming those extremists makes it harder for them to recruit. Deplatforming also decreases their influence; a 2016 study of ISIS deplatforming found, for example, that ISIS influencers lost followers and clout as they were forced to bounce around from platform to platform. And when was the last time you heard the name Milo Yiannopoulos? After the infamous right-wing instigator was banned from Twitter and his other social media homes in 2016, his influence and notoriety plummeted. Right-wing conspiracy theorist Alex Jones met a similar fate when he and his media network Infowars were deplatformed across social media in 2018.
The more obscure and hard to access an extremist's social media hub is, the less likely mainstream internet users are to stumble across the group and be drawn into its rhetoric. That's because major platforms like Facebook and Twitter generally act as gateways for casual users; from there, they move into the smaller, more niche platforms where extremists might congregate. If extremists are banned from those major platforms, the vast majority of would-be recruits won't find their way to those smaller niche platforms.
Those extra hurdles of added obscurity and difficulty of access also apply to the in-group itself. Deplatforming disrupts extremists' ability to communicate with one another, and in some cases creates a barrier to continued participation in the group. A 2018 study tracking a deplatformed British extremist group found that not only did the group's engagement decrease after it was deplatformed, but so did the amount of content it published online.
"Social media companies should continue to censor and remove hateful content," the study's authors concluded. "Removal is clearly effective, even if it is not risk-free."
Deplatforming impacts the culture of both the platform that's doing the ousting and the group that gets ousted. When internet communities send a message of zero tolerance toward white supremacists and other extremists, other users also grow less tolerant and less likely to indulge extremist behavior and messaging. For example, after Reddit banned several notorious subreddits in 2015, leaving many toxic users no place to gather, a 2017 study of the remaining communities on the site found that hate speech decreased across Reddit.
That may seem like an obvious takeaway, but it perhaps needs to be repeated: The element of public shaming involved in kicking people off a platform reminds everyone to behave better. As such, the message of zero tolerance that tech companies sent by deplatforming Trump is long overdue in the eyes of many, such as the millions of Twitter users who spent years pressuring the company to ban the Nazis and other white supremacists whose rhetoric Trump frequently echoed on his Twitter account. But it is a welcome message nonetheless.
As for the extremists, the opposite effect often takes place. Extremist groups have typically had to sand off their more extreme edges to be welcomed on mainstream platforms. So when that still isn't enough and they get booted off a platform like Twitter or Facebook, wherever they go next tends to be a much laxer, less restrictive, and, well, more extreme internet location. That often changes the nature of the group, making its rhetoric even more extreme.
Think about alt-right users getting booted off 4chan and flocking to even more niche and less moderated internet forums like 8chan, where they became even more extreme; a similar trajectory happened with right-wing users fleeing Twitter for explicitly right-wing-friendly spaces like Gab and Parler. The private chat platform Telegram, which rarely steps in to take action against the many extremist and radical channels it hosts, has become popular among terrorists as an alternative to more mainstream spaces. Currently, Telegram and the encrypted messaging app Signal are gaining waves of new users as a result of recent purges at mainstream sites like Twitter.
The more niche and less moderated an internet platform is, the easier it is for extremism to thrive there, away from public scrutiny. Because fewer people are likely to frequent such platforms, they can feel more insular and foster ideological echo chambers more readily. And because people tend to find their way to these platforms through word of mouth, they're often primed to receive the ideological messages that users on the platforms might be peddling.
But even as extreme spaces get more extreme and agitated, there's evidence to suggest that depriving extremist groups of a stable and consistent place to gather can make the groups less organized and more unwieldy. As a 2017 study of ISIS Twitter accounts put it, "The rope connecting ISIS's base of sympathizers to the organization's top-down, central infrastructure is beginning to fray as followers stray from the agenda set for them by strategic communicators."
Scattering extremists to the far corners of the internet essentially forces them to play online games of telephone regarding what their messaging, goals, and courses of action are, and contributes to the group becoming harder to control, which makes them more likely to be diverted from their stated cause and less likely to be corralled into action.
So far, all of this probably seems like a pretty good thing for the affected platforms and their user bases. But many people feel wary of the power dynamics in play, and question whether a loss of free speech is at stake.
One of the most frequent arguments against deplatforming is that it's a violation of free speech. This outcry is common whenever large communities are targeted based on the content of their tweets, like when Twitter finally did start banning Nazis by the thousands. The bottom line is that social media purges are not subject to the First Amendment rule that protects Americans' right to free speech. But many people think social media purges are akin to censorship, and it's a complicated subject.
Andrew Geronimo is the director of the First Amendment Clinic at Case Western Reserve law school. He explained to Vox that the reason there's so much debate about whether social media purges qualify as censorship comes down to the nature of social media itself. In essence, he told me, websites like Facebook and Twitter have replaced more traditional public forums.
"Some argue that certain websites have gotten so large that they've become the de facto public square," he said, and thus should be held to the First Amendment's speech-protective standards.
In an actual public square, First Amendment rights would probably apply. But no matter how much social media may resemble that kind of real space, the platforms and the corporations that own them are, at least for now, considered private businesses rather than public spaces. And as Geronimo pointed out, "A private property owner isn't required to host any particular speech, whether that's in my living room, at a private business, or on a private website."
"The First Amendment constrains government power, so when private, non-governmental actors take steps to censor speech, those actions are not subject to constitutional constraints," he said.
This distinction is confusing even to the courts. In 2017, while ruling on a related issue, Supreme Court Justice Anthony Kennedy called social media "the modern public square," noting that "a fundamental principle of the First Amendment is that all persons have access to places where they can speak and listen, and then, after reflection, speak and listen once more." And while social media can seem like a place where few people have ever listened or reflected, it's easy to see why the comparison is apt.
Still, the courts have consistently rejected free speech arguments in favor of protecting the rights of social media companies to police their sites the way they want to. In one 2019 decision, the Ninth Circuit Court of Appeals cited the Supreme Court's assertion that "merely hosting speech by others is not a traditional, exclusive public function and does not alone transform private entities into state actors subject to First Amendment constraints." The courts generally reinforce the rights of website owners to run their websites however they please, which includes writing their own rules and booting anyone who misbehaves or violates those rules.
Geronimo pointed out that many of the biggest social media companies have already been enacting restrictions on speech for years. "These websites already ban a lot of constitutionally protected speech: pornography, hate speech, racist slurs, and the like," he noted. "Websites typically have terms of service that contain restrictions on the types of speech, even constitutionally protected speech, that users can post."
But that hasn't stopped critics from raising concerns about the way tech companies removed Trump and many of his supporters from their platforms in the wake of the January 6 riot at the Capitol. In particular, Trump himself claimed a need for "Section 230 reform," that is, reform of the pivotal clause of the Communications Decency Act that basically allows the internet as we know it to exist.
Known as the "safe harbor" rule of the internet, Section 230 of the 1996 Communications Decency Act is a pivotal legal clause and one of the most important pieces of internet legislation ever created. It holds that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Simply put, Section 230 protects websites from being held legally responsible for what their users say and do while using said websites. It's a tiny phrase but a monumental concept. As Geronimo observed, Section 230 "allows websites to remove user content without facing liability for censoring constitutionally protected speech."
But Section 230 has increasingly come under fire from Republican lawmakers seeking to more strictly regulate everything from sex websites to social media sites where conservatives allege they are being unfairly targeted after their opinions or activities get them suspended, banned, or censured. These lawmakers, in an effort to force websites like Twitter to allow all speech, want to make websites responsible for what their users post. They seem to believe that altering Section 230 would force the websites to then face penalties if they censored conservative speech, even if that speech violates the websites' rules (and despite several inherent contradictions). But as Recode's Sara Morrison summed up, messing with Section 230 creates a huge set of problems:
This law has allowed websites and services that rely on user-generated content to exist and grow. If these sites could be held responsible for the actions of their users, they would either have to strictly moderate everything those users produce, which is impossible at scale, or not host any third-party content at all. Either way, the demise of Section 230 could be the end of sites like Facebook, Twitter, Reddit, YouTube, Yelp, forums, message boards, and basically any platform that's based on user-generated content.
So, rather than guaranteeing free speech, restricting the power of Section 230 would effectively kill free speech on the internet as we know it. As Geronimo told me, "any government regulation that would force [web companies] to carry certain speech would come with significant First Amendment problems."
However, Geronimo also allows that just because deplatforming may not be a First Amendment issue doesn't mean that it's not a free speech issue. "People who care about free expression should be concerned about the power that the largest internet companies have over the content of online speech," he said. "Free expression is best served if there are a multitude of outlets for online speech, and we should resist the centralization of the power to censor."
And indeed, many people have expressed concerns about deplatforming as an example of tech company overreach, including the tech companies themselves.
In the wake of the attack on the Capitol, a public debate arose about whether tech and social media companies were going too far in purging extremists from their user bases and shutting down specific right-wing platforms. Many observers have worried that the moves demonstrate too much power on the part of companies to decide what kinds of opinions are sanctioned on their platforms and what aren't.
"A company making a business decision to moderate itself is different from a government removing access, yet can feel much the same," Twitter's Jack Dorsey stated in his self-reflective thread on banning Trump. He went on to express hope that a balance between over-moderation and deplatforming extremists can be achieved.
This is by no means a new conversation. In 2017, when the web service provider Cloudflare banned a notorious far-right neo-Nazi site, Cloudflare's president, Matthew Prince, opined on his own power. "I woke up this morning in a bad mood and decided to kick them off the Internet," he wrote in a subsequent memo to his employees. "Having made that decision we now need to talk about why it is so dangerous. [...] Literally, I woke up in a bad mood and decided someone shouldn't be allowed on the Internet. No one should have that power."
But while Prince was hand-wringing, others were celebrating what the ban meant for violent hate groups and extremists. And that is really the core issue for many, many members of the public: When extremists are deplatformed online, it becomes harder for them to commit real-world violence.
"Deplatforming Nazis is step one in beating far right terror," antifa activist and writer Gwen Snyder tweeted, in a thread urging tech companies to do more to stop racists from organizing on Telegram. "No, private companies should not have this kind of power over our means of communication. That doesn't change the fact that they do, or the fact that they already deploy it."
Snyder argued that conservatives' fear of being penalized for the violence and hate speech they may spread online ignores that penalties for that offense have existed for years. What's new is that now the consequences are being felt offline and at scale, as a direct result of the real-world violence that is often explicitly linked to the online actions and speech of extremists. The free speech debate obscures that reality, but it's one that the social media users who are most vulnerable to extremist violence (people of color, women, and other marginalized communities) rarely lose sight of. After all, while people who've been kicked off Twitter for posting violent threats or hate speech may feel like they're the real victims here, there's someone on the receiving end of that anger and hate, sometimes even in the form of real-world violence.
The deplatforming of Trump already appears to be working to curb the spread of election misinformation that prompted the storming of the Capitol. And while the debate about the practice will likely continue, it seems clear that the expulsion of extremist rhetoric from mainstream social media is a net gain.
Deplatforming won't single-handedly put a stop to the spread of extremism across the internet; the internet is a big place. But the high-profile banning of Trump and the large-scale purges of many of his extremist supporters seem to have brought about at least some recognition that deplatforming is not only effective, but sometimes necessary. And seeing tech companies attempt to prioritize the public good over extremists' demand for a megaphone is an important step forward.
Why Trump's Twitter ban isn't a violation of free speech: Deplatforming, explained - Vox.com