False commentary and blinkered perspectives hampering TCO progress – EURACTIV

Never before has a situation demonstrated such broad consensus on the need for extra measures to regulate content online as the coronavirus crisis has. At the same time, digital rights defenders and sympathetic MEPs are dismissing the role of automated tools in the fight against online terrorist content. EDRi (the digital rights advocacy group), with the support of the German Pirate MEP Patrick Breyer, has helped perpetuate myths surrounding the negotiations on the proposal to remove terrorist content online (TCO).

Breyer, the shadow rapporteur on the file for the Greens/EFA group, has taken to painting a misleading picture of the Council's and the European Commission's approach to content moderation and of the terms being negotiated on the proposal to remove terrorist content online. Arguing against the deployment of automated tools to cut off the spread of harmful content at the source, in the noble pursuit of protecting the "free internet", is misleading. The internet is not free: what we see, and what influences our daily lives, is largely controlled by algorithms created and managed by digital platforms focused on their own revenue streams. At CEP, we continue to demonstrate the need for automated tools with human verification to ensure a safe and secure internet. But as a first step, how about an honest and open discussion about the situation as it stands and the obstacles we face in striving for an EU-wide response?

The proposal on preventing the dissemination of terrorist content online is stalled over two major sticking points: the use of automated tools and cross-border removal orders.

The Council has agreed that any tools deployed must be proportionate to the problem and to the size and capacity of the platform, and must be subject to human review; a Commission-funded VOX-Pol[1] study has, moreover, demonstrated the effectiveness of such tools. Despite this, the European Parliament continues to insist, without proposing any alternative solution, that there should be no obligation to use automated tools.

The second issue causing undue delay is that the European Parliament cannot agree to cross-border removal orders, demanding instead that only the Member State in which a company is established may issue removal orders against it. Since most major platforms have their European headquarters in Ireland, the European Parliament is in effect asking Ireland to become the terrorist content police of the European Union. Interestingly, the Member States themselves agreed to cross-border removal orders, thereby ensuring that terrorist content is genuinely tackled by an efficient European solution based on mutual trust. So whose rights is the European Parliament protecting, if even the Member States themselves are in agreement? Citizens are certainly not better protected if they have to defend their rights exclusively in Ireland.

The negotiations on these aspects of the file have unfortunately fallen foul of the reality of COVID-19. Certain MEPs appear intent on slowing the discussions down by refusing to interact with their peers virtually, focusing instead on peddling false narratives from their blogs. Claims that these issues would be dealt with more effectively under the new Digital Services Act smack of delay tactics. Continuing to push this proposal down the priority list is an insult to victims of terrorism and their families all over Europe.

In Germany, recent CEP research on the notice-and-takedown procedures under the NetzDG found that the flagging systems deployed by platforms like Facebook and Instagram to identify and remove terrorist content online were ineffective. CEP's snapshot analysis, conducted over a fortnight, showed that only 43% of the flagged content was blocked or removed: of the 93 evidently illegal content items reported, 24 were blocked under the NetzDG process and 16 were removed under the platforms' community guidelines.
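For readers who want to check the figures, a minimal sketch in Python, using only the three numbers reported above, shows how the 43% rate is derived:

# Minimal check of the takedown rate reported in CEP's NetzDG snapshot study.
# The three input figures come from the paragraph above; nothing else is assumed.
reported = 93            # evidently illegal items flagged to the platforms
blocked_netzdg = 24      # items blocked under the NetzDG procedure
removed_guidelines = 16  # items removed under community guidelines

actioned = blocked_netzdg + removed_guidelines  # 40 items in total
rate = actioned / reported                      # 40 / 93, roughly 0.43

print(f"{actioned} of {reported} items actioned ({rate:.0%})")
# Output: 40 of 93 items actioned (43%)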

Flagging systems are therefore not the most effective solution. We need EU-wide removal orders to ensure that companies are obliged to take down illegal content. Alongside this, companies need to deploy every measure in their toolbox, including automated tools with human verification, to keep users safe.

Now is not the time for delay, and we therefore urge MEPs to seize the opportunity to inform the debate on platform liability. CEP has extended an invitation to all MEPs working on the TCO and the DSA to discuss the lessons that can be learned from the NetzDG and beyond. Now is the time to remain united in showing the same transparency, oversight and accountability that both CEP and MEPs expect from platforms.

[1] VOX-Pol has studied the various tools used for content moderation, including human review and automated tools, and finds that, despite some challenges, automatic content detection tools allow more content to be removed, more quickly and with greater coverage. As a result, platforms using such tools have become much more difficult places for terrorist organisations to operate. https://www.voxpol.eu/download/vox-pol_publication/DCUJ770-VOX-Extreme-Digital-Speech.pdf
