Following my December overview article – Online Safety Bill: Aims and Limitations – there have been some significant developments. The Joint Committee (JC) has published its long-awaited report and recommendations, having spent five months investigating the draft Bill, hearing from 50 witnesses and receiving 200 submissions. In short, it considered that the Bill was overly complex and lacked clarity in several key respects, and recommended that it be entirely re-structured in order to better meet its stated ambitions. However, it also made a number of substantive recommendations relating to protections.
The report itself makes for fascinating reading, but the scope of this article is restricted to the headline points relating to increasing protection against certain online harms. So what are the main recommendations, and what is the Government's response?
The JC recommended that the Bill be significantly strengthened in four ways:
- What is illegal offline should be regulated online – i.e. if something is a criminal offence offline, it should be regulated online. The Committee’s view is that a law aimed at online safety that does not require companies to act on, for example, misogynistic abuse or the stirring up of hatred against disabled people would not be credible.
- Ofcom should be required to issue a binding Code of Practice to assist providers in identifying, reporting on and acting on illegal content, in addition to those on terrorism and child sexual exploitation and abuse content.
- The Government should implement the Law Commission’s recommendations in its Modernising Communications Offences report for new criminal offences covering a number of harmful online activities, to replace the outdated offences under, in particular, the Malicious Communications Act 1988.
- Pornography should not be accessible to children. The Committee recommended that all statutory requirements on user-to-user services, for both adults and children, should also apply to Information Society Services likely to be accessed by children, as defined by the Age Appropriate Design Code, with the aim of ensuring that all pornographic websites would have to take serious measures to prevent children from accessing their content.
The Government has responded largely through various DCMS announcements in February and March this year. The response so far is as follows:
- The Law Commission’s recommendations for a harm-based communications offence, a false communications offence and a threatening communications offence will be brought into law through the Bill. Further consideration would be given to the Commission’s other recommendations for offences relating to cyberflashing, hoax calls, encouraging or assisting self-harm, and epilepsy trolling.
- Further priority offences would be set out on the face of the Bill, including incitement to and threats of violence, hate crime, and financial crime. (Offences relating to terrorism and child sexual abuse and exploitation are already listed). This is a very significant development insofar as financial crime is concerned, as this was previously not within the scope of the Bill at all. Allied to this point is the recent announcement, on 9 March 2022, that category 1 and 2A services (i.e. the largest user-to-user and search services) would have a duty to prevent the publication of paid-for fraudulent adverts. This attempts to capture ‘scam adverts’, although the detail will be left to Ofcom, which will create and publish specific Codes of Practice setting out how services can comply with the new duty. Separately, the Government launched an Online Advertising Programme consultation, which runs for 12 weeks (until 1 June 2022) and is intended to address how online advertising should be regulated. This can be accessed here.
- The Bill would be amended so that all providers who published or placed pornographic content on their services would need to prevent children from accessing that content.
- Online abuse, including anonymous abuse, would be tackled by the introduction of two additional duties on category 1 service providers, namely:
- a “user verification duty” would require category 1 providers to give adult users an option to verify their identity. Ofcom would publish guidance setting out how companies could fulfil the duty and the verification options that companies could use.
- a “user empowerment tools duty” would require category 1 providers to give adults tools to control who they interacted with and the legal content they could see.
In terms of timing, the Online Safety Bill will be introduced “as soon as possible” (Chris Philp, on behalf of DCMS, written answer on 25 Feb 2022) but no date has been given for its introduction as yet. It remains to be seen how the changes will be inserted into the draft Bill and whether the JC’s concerns about structure and loopholes will be addressed. The focus remains very firmly on Ofcom and it will need considerable resources in order to implement its new and extensive functions.
Samantha Broadfoot QC is a specialist practitioner in public law and human rights, with an interest in data protection and regulation. She is also an Assistant Coroner and a Recorder.