After several months of heated public debate on the regulation of online harms in the UK, with very little in the way of substantive updates, last week brought a flurry of activity. Three highlights will be of interest to online platforms hosting user-generated content:
(1) New harm-based communications criminal offence proposed
First, the Law Commission published its final report recommending reforms to the communication offences to target serious harms arising from online abuse, while more effectively protecting the right to freedom of expression online.
The recommendations include the creation of a new harm-based communication offence. This would criminalise behaviour where:
- the defendant sends a communication that is likely to cause harm (i.e. serious distress);
- in sending the communication, the defendant intends to cause harm; and
- the defendant sends the communication without reasonable excuse (such as contribution to a matter of public interest).
This new offence is intended to help to address pile-on harassment, where several individuals target the same person with harmful communications (the Law Commission was not persuaded that a separate, specific offence of pile-on harassment was necessary or desirable). The report also includes recommendations for other new offences, including offences to tackle cyberflashing and the encouragement of serious self-harm.
The Law Commission recognises that the Online Safety Bill will be passed in parallel with any reform to the criminal law. Its view is that social media platforms are better placed to take action to prevent harmful content than the criminal law, which is “an expensive tool with limited resources to deploy”. It is refreshing to see such an honest acknowledgement of the difference between passing laws and changing behaviour in practice.
(2) Online Safety Bill faces freedom of expression criticisms
Second, the House of Lords Communications and Digital Committee published its report on freedom of expression in the digital age, which focuses heavily on the draft Online Safety Bill. The report praises elements of the proposals, such as the requirement for platforms to remove illegal content. It also makes certain criticisms of the Bill, including that:
- The duties relating to content that is legal but may be harmful to adults are unworkable, and could not be implemented without unjustifiable and unprecedented interference with freedom of expression. The report proposes replacing these “legal but harmful” duties with a new design duty, obliging platforms to demonstrate they have taken proportionate steps to mitigate the risk of encouraging and/or amplifying uncivil content.
- Platforms, in complying with their duties to identify and remove content criminalised by the Law Commission’s proposed harm-based offence, will also remove legal content. This criticism demonstrates the difficult balancing act platforms will face in removing prohibited content without disproportionately impacting users’ freedom of expression.
It remains to be seen whether these freedom of expression-based concerns will be taken up by the Joint Committee responsible for scrutinising the draft Bill, but this development will be welcomed by commentators who have expressed concerns about the Bill’s wide-ranging obligations.
(3) Parliament presses ahead with pre-legislative scrutiny of the Bill
Finally, the Joint Committee on the Draft Online Safety Bill has been established before the end of the summer term, showing that the government is pressing ahead after coming under criticism for not progressing the Bill quickly enough. The Committee will meet for the first time on Tuesday 27 July to elect its Chair, and it must report by 10 December.
This means the Bill itself will not be passed until 2022 at the earliest. A call for written evidence is expected in due course; in the meantime, platforms should be turning their attention to their online harms compliance programmes.
For more information on the draft Online Safety Bill, see our ‘at a glance’ summary.