All the news that’s not fit to print: EU increases pressure on platforms to tackle ‘fake news’

Francine Cunningham, Senior Public Affairs Manager in our Brussels office, provides an overview of the EU strategy to tackle online disinformation

In the countdown to the European Parliament elections in May, the corridors in Brussels and Strasbourg echo with anxious talk not only about populist candidates and electoral colleges, but about “social bots”, “deep fakes” and “troll farms”. The challenge of fighting online disinformation is high on the agenda of the EU institutions and Member States, concerned about potential manipulation of public opinion ahead of the elections. According to a Eurobarometer opinion poll, 83% of Europeans regard fake news as a threat to democracy. Mounting concern about the impact of online disinformation has also raised the prospect of future regulatory intervention regarding the role of online platforms.

European Commission President Jean-Claude Juncker awarded the relatively new Commissioner for Digital Economy & Society, Mariya Gabriel, a much sought-after prize in 2017: responsibility for overseeing an EU strategy to combat online disinformation. During 2018, the Commission worked with a High Level Group of experts, including representatives of publishers, advertisers and online platforms, to develop a self-regulatory code of practice, which was released on 26th September 2018. The Code has been signed by four technology companies, namely Google, Facebook, Twitter and Mozilla, as well as trade associations representing the advertising industry.

This initiative follows on from the Communication “Tackling online disinformation: a European approach” presented in April 2018. In parallel, the EU executive has worked with national capitals to create a strategy to counter mass online disinformation campaigns intended to polarise public opinion or sow distrust in institutions, especially in relation to national elections and the forthcoming European Parliament election.

For the purposes of the Code, the Commission has defined “disinformation” as verifiably false or misleading information which (a) is created, presented and disseminated for economic gain or to intentionally deceive the public; and (b) may cause public harm, meaning threats to democratic political and policymaking processes as well as public goods such as the protection of EU citizens’ health, the environment or security. The Commission has clarified that this definition does not include misleading advertising, reporting errors, satire, parody or clearly identified partisan news and commentary. It is also without prejudice to binding legal obligations, self-regulatory advertising codes and standards regarding misleading advertising.

The Code itself is a set of non-binding guidelines calling on technology platforms to provide monthly reports to increase transparency around, among other things, who is behind the spread of disinformation via “bots” (programmes on the internet that perform repetitive functions) and “troll farms” (organisations which post false accusations or inflammatory remarks in order to disrupt online communities).

Its approach is based on five pillars:

  • increasing scrutiny of advertisement placements, notably to reduce revenues for purveyors of disinformation;
  • ensuring transparency about political advertising and issue-based advertising;
  • strengthening the integrity of services by intensifying efforts to close fake accounts and safeguard against misrepresentation;
  • empowering consumers by diluting the visibility of disinformation and improving the findability of trustworthy content; and
  • empowering the research community by cooperating with fact-checkers and researchers and providing access to data for fact-checking and research activities.

The importance of being viral

On 29th January 2019, the Commission ratcheted up pressure on technology and social media companies to increase their efforts to tackle the circulation of misinformation, or else face the prospect of regulatory intervention. The threat of future legislation came as the Commission evaluated the first report from technology companies outlining the progress made in implementing the self-regulatory Code. “You can’t fight fire with water pistols. It’s time for platforms to show they’re bringing more than water pistols,” said Security Commissioner Julian King at a press conference.

According to the Commission’s evaluation, the four technology companies that are signatories to the Code have taken measures to meet their commitments, with work advanced in some areas such as taking down fake accounts, improving the transparency of political ads and de-monetising some purveyors of disinformation. However, the Commission has stated that more significant progress is required on other essential parts of the Code, such as the transparency of issue-based ads and cooperation with fact-checkers and the research community.

In addition, the Commission has asked national authorities to monitor disinformation and share the results of their research with other EU countries via a “rapid alert system” to be established by March 2019. This would act as an early warning system for Member States and media with respect to emerging waves of fake news circulating online.

Nothing but the post-truth

While the actions recommended by the Code are voluntary, they have not been entirely free from controversy. Criticism has already been voiced about the possibility of such guidelines being invoked by governments keen to limit free speech in order to avoid public criticism. In addition, the EU External Action Service’s Strategic Communications Division (“StratComms”) has already faced criticism for what some regard as overstepping its role in the flagging of content, for example in the Netherlands. Commissioner King has countered that the initiative is about shining a light on the provenance of news, rather than judging whether particular information is true or false.

Mindful of potential accusations of censorship, some Member States are emphasising that counter-disinformation initiatives should remain within the scope of awareness-raising and media literacy activities. Nevertheless, the Commission is clearly waving the stick of a potential proposal for regulation by the end of this year if the non-regulatory approach does not yield sufficient results.

The threat of regulation puts real pressure on the online platforms and advertising industry representatives to ensure the effectiveness of the measures taken to fight disinformation. Ahead of the European elections, the companies and trade associations which signed the Code have committed to providing comprehensive information on a monthly basis, outlining how they are implementing the commitments to which they subscribed. In addition, since January 2019, signatories have been expected to reply to the Commission’s specific requests. This potentially increases the administrative burden on the companies and trade associations involved.

An ongoing bone of contention for the Commission is likely to be the issue of access to data for fact-checkers and academic researchers. The Commission has already provided EUR 1 million to support a project for a Social Observatory for Disinformation and Social Media (SOMA), which started its work in November 2018, and is now set to provide additional funding of EUR 2.5 million. However, the Commission’s urging of technology companies to furnish better access to data for fact-checkers and researchers is likely to come up against company security policies and trade secrets.

MediaWrites will keep an eye on how these developments play out.

Francine is the Regulatory & Public Affairs Director, based in Bird & Bird's Brussels office. With 20 years' experience of working in the EU and U.S. regulatory environment, in both the private and public sectors, Francine has extensive knowledge of the European regulatory environment for the media, technology and communications sectors. At a time of enormous regulatory challenges that will define the future of the digital economy, she advises clients how to navigate complex EU decision-making processes to achieve specific industry goals.
