Tomorrow the Parliamentary Committee that has been scrutinising the UK's draft Online Safety Bill (OSB) will publish its report setting out recommendations on how the draft bill should change before it enters Parliament. The Committee has heard evidence from a wide cross-section of parties: from those platforms likely to be within scope to not-for-profits and lobby groups; from banks and insurers to former Premier League footballers.
Navigating these submissions - and the many others that the Committee has received - to craft a cohesive and effective OSB is no easy task. Tomorrow we'll discover just how the Committee proposes to achieve it.
Input from regulators
All of those giving evidence have shared valuable perspectives, but perhaps some of the most interesting submissions came from the regulators who also have some remit over online platforms. Ofcom will be given responsibility for overseeing and enforcing the Online Safety Bill once it comes into force, but (as noted in our previous blog posts) the Information Commissioner's Office (ICO), the Financial Conduct Authority (FCA) and the Competition and Markets Authority (CMA) all have some jurisdiction in this area too. What did they each have to say about the scope of the OSB?
Information Commissioner's Office evidence
The ICO's evidence* focused on the work they have already done in the sphere of online harms. Despite being the regulator for personal data - not online content - the ICO's Age Appropriate Design Code contains many requirements that appear to be more about children's access to online content than about the use of their personal data.
The ICO say that their primary concern about the OSB is the need to clarify how the data protection and online safety regimes will interact: to avoid the risks of duplication, inconsistent decision-making and confusion for organisations and the public (a theme we have previously written about).
Financial Conduct Authority evidence
The FCA's evidence* reflected a concern that has permeated many of their public statements over the past year. They are laser-focused on tackling online scams, which have proliferated over the course of the pandemic. The main change the FCA want to see to the OSB is to ensure that online platforms' responsibilities extend not only to user-generated content but also to paid-for advertising. As the FCA neatly explained: "as the Bill stands if a criminal were to post a scam in a local parents’ social media group, the online platform would be compelled to remove it if alerted.
However, if the same criminal paid the platform to promote that scam to thousands of individuals, the platform would not be obliged to take action under the Bill." The rest of the FCA's requests mirrored this overarching theme: asking that fraud be designated as "priority illegal content" (thereby requiring platforms to take proportionate steps to minimise its presence and dissemination, rather than merely removing it swiftly once notified) and explaining how the FCA would apply the financial promotions regime in conjunction with the OSB.
Competition and Markets Authority evidence
Finally, the CMA's evidence* was perhaps the most unexpected of all the regulators' submissions. They stated that the OSB "risks inadvertently setting a lower standard of consumer protection on platforms for economic and financial harms than that already envisaged by current law". This is because "under existing law, where platform operators fail to act with professional diligence – that is, by taking adequate steps to effectively address economically harmful illegal content on their platforms – and that failure is likely to distort UK consumers’ economic behaviour, they are likely to infringe the general prohibition in the [Consumer Protection from Unfair Trading Regulations 2008]."
The CMA did note that many platforms did not agree with their reading of the law, saying that online platforms typically contend that the existing law cannot make them responsible for the content of third parties, cannot impose general monitoring obligations on them, and only requires them to remove illegal content once they are notified of it. But the CMA were keen to ensure that, either way, the OSB does not inadvertently lower the bar required of platforms' online safety measures.
*In each case, we've linked to the core written submission made by each regulator, but each also gave oral evidence and some followed up with further written correspondence. All can be found at: Draft Online Safety Bill (Joint Committee) - Publications - Committees - UK Parliament.
Discover our Tech Legal Outlook 2022 publication, in which we explore the key global trends in the technology sector that will shape the legal outlook in 2022.