Event Report: Internet Regulation, Responsibility and Safety – our reporter attends the Westminster eForum Keynote Seminar

How important is a robust policy in ensuring internet safety? What are the practicalities of implementation? What role should providers be playing? Jane Goodacre of Bird & Bird London reports on the hot topics discussed at the Westminster eForum Keynote Seminar on Internet Regulation, Responsibility and Safety.

Optimistic outlook?

When talking about the challenges the internet is facing, the inventor of the World Wide Web, Tim Berners-Lee, has recently been quoted in the press as saying that he is “an optimist, but an optimist standing on top of the hill with a nasty storm blowing in my face, hanging onto a fence… We have to grit our teeth and hang on to the fence and not take it for granted that the web will lead us to wonderful things.” That sentiment was certainly reflected at the Westminster eForum Keynote Seminar.

There’s no denying the good that comes from the internet, but, as set out in the opening remarks, there is a growing sense that we have been ‘doing’ the internet wrong. Whilst the majority of the diverse panellists agreed on the need for a change in approach as to how the internet is regulated (i.e. whether or not to shift away from the current practice of self-regulation), the precise nature of any change remains a topic of much dispute.

The current state of play

The seminar was run at a pivotal point in the conversation about internet regulation and responsibility taking place within the UK government and EU institutions. A few of the key discussions ongoing in this area are:

  • The Communication from the European Commission released in September 2017 calls for online platforms to step up the fight against illegal content online. The Commission is in dialogue with online platforms and stakeholders, and is monitoring and assessing whether additional measures are needed, including possible legislative measures. The Commission estimates that its review will be complete by May 2018.
  • The Internet Safety Strategy Green Paper published in October 2017 by the Department for Digital, Culture, Media and Sport considers the responsibilities of companies to their users, the use of technical solutions to prevent online harms and the government’s role in supporting users. The accompanying public consultation closed in December 2017 and the government plans to issue a response in 2018.
  • Oral evidence was provided to the Digital, Culture, Media and Sport Select Committee in respect of its ongoing enquiry into fake news in December 2017 and January 2018.
  • The Electoral Commission continues its investigation into campaign activity during the EU Referendum. Part of the Commission’s work is focused on online platforms, such as Facebook and Twitter.

Key themes from the discussion

Whilst no clear consensus was reached in respect of who should take responsibility, for what and how that should work in practice, some key themes did emerge from the discussions:

  1. The public have lost trust in the internet, which is a big problem for individual users and businesses alike. Regaining trust is the shared problem of all stakeholders in the internet and high on the list of priorities for platforms that survive on user interaction.
  2. The lack of transparency as to what is being taken down and the criteria for any such take-down only compounds the trust deficit we are facing. It isn’t just an issue in respect of platforms’ algorithms and processes; there is also a lack of clarity in respect of the criteria used by bodies such as the Counter Terrorism Internet Referral Unit (CTIRU) in taking down content (not helped by the fact that CTIRU has recently refused a Freedom of Information request).
  3. There is no “one size fits all” solution when it comes to removing illegal and harmful content from the internet. Whilst it may be easy to identify child pornography at a glance, to determine whether content contains hate speech may require a look into the context of a post. Different processes are required for different types of content take-down.
  4. There is a call for clear legal definitions when it comes to deciding what constitutes harmful content and what should therefore be removed. Some commentators advocate for policy makers, who are informed by the public’s concerns, to provide those definitions so that it isn’t left in the hands of platforms.

So, what does a solution look like?

The direction of travel certainly points towards further policy in this space, in a bid to enlighten all members of the food chain as to:

  • what is and isn’t acceptable online;
  • what is being done about harmful content and who is doing it; and
  • who is checking it has been done correctly.

However, policy makers need to tread carefully as the internet is a delicate ecosystem with an array of different rights which all hang in the balance. This is an extremely difficult area in which to legislate, but at the same time there is growing pressure to change the current regulation processes. There is going to be change in this area in 2018, so watch this space.
