Services in scope
Protection of children is a key objective behind this legislation. The JSC considered whether the current draft goes far enough in ensuring that the areas of the internet which pose the highest risks to children are sufficiently regulated. It concluded that the services currently in scope of the draft OSB (user-to-user services and search engines) would not necessarily catch some of the highest-risk websites, such as pornography sites. It therefore recommended that the definition of in-scope services be amended to cover all services which are likely to be accessed by children.
The test to be applied when considering whether a service is “likely to be accessed by children” was also scrutinised. In particular, comparisons were drawn between the draft OSB’s wording and the ICO’s Age Appropriate Design Code (“AADC”) wording to determine the same issue, and the JSC concluded that there should be closer alignment between the draft OSB and the AADC.
Content in scope
The JSC’s report recommended that references to harmful “content” in the draft OSB be amended to “regulated content and activity”, and that the draft OSB focus on how the design of platforms, not just the content on them, can be harmful. Harmful platform designs could include algorithms which create “rabbit holes” that draw users inadvertently into excessive use of the service.
The JSC has made a number of recommendations aimed at removing uncertainty in determining what is or is not to be treated as “illegal content” in the draft OSB. In addition to making clear that any criminal offence can be committed online (including hate crime offences and extreme, illegal pornography), it supports the introduction of various new criminal offences in the draft OSB. These include cyberflashing, promotion of serious self-harm and sending threatening communications.
The report also recommends the introduction of a further new criminal offence, that of disinformation relating to election material, and concludes that “fraud offences” (online/investment/romance scams and fraudulent advertising) should also come within the definition of priority illegal content, to ensure services take proactive steps to address this.
The JSC highlights issues with section 11 of the draft OSB, which focuses on safety duties around content which is legal but harmful to adults. The JSC recommends removing this section and replacing it with a duty on service providers to have in place “proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities”. This duty would not be limited to harms identified in the online world; it would also ensure that offline harms, such as abuse and harassment, are covered. Extending the protection to cover illegal content already regulated offline would ensure that any interference with freedom of expression is one already recognised by law, so that abuse does not go unregulated online.
On the topic of journalistic content, to strengthen the protection for freedom of expression, the JSC recommends that news publisher content should not be “moderated, restricted or removed” unless it is illegal. There should also be protection for content of “democratic importance” to make sure content is judged on merit and freedom of expression is not curtailed unnecessarily.
More details on duties to tackle content
Under the current wording of the draft OSB, the only Codes of Practice which Ofcom is specifically obliged to produce in relation to illegal content are for terrorism and child sexual abuse and exploitation content. The JSC, recognising the challenges posed to services in identifying and tackling all other forms of priority and non-priority illegal content (including, but not limited to, the need to balance any actions they take against freedom of expression and to ensure steps taken are “proportionate”), has recommended that Ofcom produce a binding Code of Practice to assist services in how they should approach this obligation.
The report addresses the topic of online anonymity, in particular issues surrounding the lack of traceability, the risk of online abuse and the risk of accounts being created for illegal or harmful activities. It recommends a Code of Practice which would set out standards for identification and verification of identity online. This recommendation responds to the current debate over tackling anonymous online “trolls” while preserving the benefits derived from anonymous accounts, and aims to strike a balance between these two interests.
The report recommends simplifying the test of what content is harmful to children by specifying on the face of the draft OSB the forms of content which should be deemed “harmful”. Where this cannot be done and services are required to make their own judgment, it recommends aligning the test for such judgment calls with the approach taken by the Law Commission when devising the new harm-based communications offences.
To assist services, the report recommends that Ofcom produce a binding Code of Practice on how to protect children from illegal and harmful content, which should address, among other things, how to achieve proportionate deployment of age assurance methods.
Much has already been made of the hefty powers being placed into Ofcom’s hands by the draft OSB. However, the JSC report contains a number of recommendations which underline the breadth and centrality of the role the regulator will play in the new regime.
The report emphasises the need for Ofcom to start work immediately on developing the risk profiles for different types of regulated service, and lists the factors which should be taken into account when creating such profiles. These profiles are key to enabling services to carry out their own risk assessments, an obligation which lies at the heart of the draft OSB.
The report recommends that all Codes of Practice be binding and that Ofcom start work on them immediately. The JSC takes the view that Codes of Practice should be produced on many areas, including terrorism, CSEA, regulated content and activity for adults, child online safety, safety by design, age assurance, freedom of speech, moderation and reporting.
The JSC has also recommended that each regulated service have a designated “safety controller” – either a board member or someone who reports to the board – who could be held criminally liable where the service provider repeatedly fails to comply with its obligations, creating a significant risk of serious harm to its users.
The report provides useful insight into how the draft OSB might develop in certain fundamental respects. The following points stand out from the recommendations:
- The JSC has homed in on certain aspects of the online world as warranting closer regulation – namely, services which pose a risk to children, and the use of algorithms. This is no real surprise given some of the headline-grabbing allegations made about large platforms’ practices last autumn. It is safe to expect that these areas will remain the central focus during the draft OSB’s passage through Parliament.
- We have gleaned some helpful indications as to how Ofcom may assess risk (which will in turn guide how services should assess risk too). This has been a key unanswered question to date, preventing services from properly understanding how they might fit into the new regime and from starting to take preparatory steps.
- Baked into the entire legislative proposal is the idea that any steps required of services will be proportionate to the risk they pose. Whilst we now have a better idea of how risk might be assessed, we still do not know how Ofcom will judge what action is proportionate in light of that risk. What has also emerged from the JSC’s work, however, is that the Ofcom Codes of Practice will be absolutely key to compliance. If they are made binding, the choice to “opt out” and achieve compliance in an “equivalent” way may no longer be open to services, in which case Ofcom will need to be as prescriptive as possible in its Codes of Practice as to how services can achieve proportionality.
- It appears that amendments to the draft OSB, if the JSC recommendations are adopted, could go some way to rationalising unfamiliar legal concepts such as “harmful content” and so reduce the burden placed on services to make difficult judgment calls as to what is and is not in scope. We would expect the “reasonably foreseeable harm” test to remain a hotly debated topic as the draft OSB progresses, given the subjectivity of this test when applied to different categories of user.
Moving forward – what’s next?
A revised draft OSB is expected to be published shortly, after which it will be placed before Parliament for consideration and debate. The stated goal is still for the draft OSB to be passed by the end of this year, but it remains to be seen whether this is achievable given the number of issues identified by the JSC.
Please also find our article on the draft OSB here: https://www.twobirds.com/en/news/articles/2021/uk/draft-online-safety-bill-published-landmark-legislation-set-to-be-considered-by-mps