The UK’s new Online Safety Bill breaks significant new ground. The Bill is intended to minimise interference with lawful speech and focuses on illegal content, but it risks significant interference with communications between private citizens. Importantly, the risk comes just as much from the provisions on illegal content as from the much looser – and potentially soon to be amended – rules on “lawful but harmful” content.
Hard algorithmic trade-offs
Social media companies and search engines must clamp down not just on egregious images of child sexual abuse and terrorist propaganda, but also a plethora of other priority illegal content such as content that constitutes fraud by false representation or which assists unlawful immigration.
There is no means to automatically identify this content at scale with absolute accuracy. Tools exist to detect known illegal images (such as PhotoDNA), but determining whether some of these other priority offences have occurred will be extremely difficult for a human – let alone a machine – given that these offences often turn on the knowledge or intent of the poster[1], which is not obvious from the face of the post. There is no classifier capable of accurately identifying all posts that tip into assisting unlawful immigration, let alone those that cause “spiritual injury”[2].
This forces hard trade-offs between the sensitivity and precision of any system – i.e. should it aim to block almost all illegal content, even if that means significant over-blocking of lawful communications? Or should it block only in the limited cases where the content is definitely illegal? The punitive sanctions for breach risk these algorithms being programmed to err on the side of removal.
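The trade-off can be made concrete with a toy example. The sketch below (purely illustrative, not any real moderation system) scores hypothetical posts with a classifier’s probability that each is illegal, then shows how moving the blocking threshold trades sensitivity (the share of illegal posts caught) against precision (the share of blocked posts that were actually illegal):

```python
# Hypothetical classifier scores paired with true labels
# (1 = illegal, 0 = lawful). Entirely made-up figures.
posts = [
    (0.95, 1), (0.80, 1), (0.60, 0), (0.55, 1),
    (0.40, 0), (0.30, 0), (0.20, 1), (0.10, 0),
]

def evaluate(threshold):
    """Return (sensitivity, precision) if posts scoring >= threshold are blocked."""
    tp = sum(1 for score, label in posts if score >= threshold and label == 1)
    fp = sum(1 for score, label in posts if score >= threshold and label == 0)
    fn = sum(1 for score, label in posts if score < threshold and label == 1)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    return sensitivity, precision

# A low threshold catches more illegal posts but over-blocks lawful ones;
# a high threshold blocks only the clearest cases and misses the rest.
for threshold in (0.25, 0.50, 0.75):
    s, p = evaluate(threshold)
    print(f"threshold={threshold:.2f}  sensitivity={s:.2f}  precision={p:.2f}")
```

On this toy data, lowering the threshold from 0.75 to 0.25 catches more of the illegal posts but blocks twice as many lawful ones – exactly the dilemma a regulated platform faces when sanctions push it towards the lower threshold.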
Spaghetti-like legislation in a political pressure cooker
This is just one aspect of a live and contentious debate about this new law. Kemi Badenoch, as part of her campaign to become leader of the Conservative Party, accused the Bill of “overreach” and “legislating for hurt feelings”.
The then Secretary of State fiercely opposed this interpretation, but the new Prime Minister, Liz Truss, has indicated that the Bill will be watered down in response to these concerns. As the retired, but not retiring, Lord Sumption notes, the Bill’s “complex paper chase of definitions, its weasel language suggesting more than it says, all positively invite misunderstanding. Parts of it are so obscure that its promoters and critics cannot even agree on what it does”.
Ofcom will need to take centre stage to mould this spaghetti-like legislation into a workable regulatory regime. It also wields a mighty stick: breach of the new law can result in fines of up to £18 million or 10% of annual turnover, whichever is the greater, coupled with other powers such as the ability to block access to web services and so effectively exclude operators from the UK market.
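The scale of that stick is easy to illustrate. A minimal sketch of the “greater of £18 million or 10% of annual turnover” cap, using entirely hypothetical turnover figures:

```python
def max_fine(annual_turnover_gbp):
    """Greater of a fixed GBP 18m floor or 10% of annual turnover."""
    return max(18_000_000, 0.10 * annual_turnover_gbp)

# For a smaller service the GBP 18m floor dominates; for a large platform
# the turnover-based cap takes over (hypothetical turnover figures).
print(max_fine(50_000_000))       # 10% would be GBP 5m, so the GBP 18m floor applies
print(max_fine(100_000_000_000))  # 10% of turnover: GBP 10bn
```

For the largest platforms, the turnover-based limb dwarfs the fixed floor, which is why the appeal standard attached to these sanctions matters so much.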
Challenging Ofcom will be hard…
Despite Ofcom’s significant and contentious new role, challenging its decisions will be an uphill struggle. The vast majority of Ofcom’s decisions – including issuing corrective orders and fines – can only be appealed to the Upper Tribunal using judicial review principles.
Judicial review is a flexible remedy capable of shaping itself to the questions raised. In some cases, it might even require a consideration of the underlying merits, such as where required on human rights grounds.
However, the Upper Tribunal’s appetite to unpick Ofcom’s decisions will inevitably be tempered by the nature of those decisions. Appeals will raise complex technical issues (such as the ability of machine learning classifiers to identify illegal material) and social and economic judgements (do we want a safe internet or a free one?). Faced with such complex and open-textured questions, the Tribunal might understandably shy away from deciding that Ofcom’s decisions are so unreasonable that no reasonable regulator would have taken them.
In other words, overturning Ofcom’s decisions is likely to be possible only if there is illegality, irrationality or procedural impropriety – a very high threshold, even for obviously bad decision-making.
Why not a full merits review?
One way to apply a better check and balance is to allow a full merits review. The Upper Tribunal could be instructed to consider not only whether the decision is lawful but also, where it involves discretion on Ofcom’s part, whether that discretion should have been exercised differently.
Many other regulatory decisions are subject to a full merits review, such as appeals against UK GDPR fines from the Information Commissioner. This does not involve disregarding the original decision and starting afresh. The Upper Tribunal would still have to pay careful attention to the reasons for the original decision and does not become a “fully equipped duplicate regulatory body waiting in the wings” (per Lord Jacob in T-Mobile v Ofcom (2009)).
Perhaps the judicial review standard is preferred because of Ofcom’s scars from the early days of the Communications Act 2003, when its decisions were routinely appealed regardless of the merits. The Government described these appeals as being used as a “one-way bet, and a chance to re-open regulatory decisions, encouraging lengthy and expensive litigation and holding back decision-making” (see the Streamlining Regulatory and Competition Appeals consultation). This led eventually to section 87 of the Digital Economy Act 2017, which imposed a judicial review standard for many, previously merits-based, appeals.
“Quis custodiet ipsos custodes” or “who guards the guardians”?
So, who cares? Why create a situation in which major tech platforms can use their deep pockets to launch appeals as a “gaming tactic either to delay specific decisions or more generally to discourage regulators from making more radical or controversial decisions”?
The answer comes from the volatile and high-profile nature of social media. Events will, from time to time, place Ofcom under extreme public and political pressure to “do something”, egged on by a press calling for imaginary technological solutions to provide simple answers to complex problems.
Holding the line in these situations will be very hard, but a full merits review keeps a regulator honest: it knows that it has to do its job properly or risk having to justify itself to an independent court.
In the absence of a full merits review, what other options are available to ensure fair and predictable regulation? Some answers come from the Bill itself. Ofcom is obliged to draw up a wide range of guidance and statutory Codes. These will put the flesh on the bones of the Bill and must be subject to a public consultation.
Ofcom has also consulted widely with stakeholders on an informal basis and provided a clear and detailed overview of the implementation of this new law through its “roadmap to regulation”, including its plan for the first 100 days (see our previous post).
This is all welcome but is not enough. Good regulation should not depend on the good intentions of the regulator or the length of Lord Grade’s foot. Parliament needs to do more as the Bill progresses. For example:
- Mixed approach: Even if a full merits review is not appropriate for all of Ofcom’s decisions, there is a very strong case for it to apply in relation to fines and other sanctions.
- Incentive to early settlement: A full merits review could be coupled with incentives for early settlement of a sanction. For example, the Information Commissioner offers a 20% discount for early settlement of GDPR fines, and the Financial Conduct Authority provides a similar 30% discount. This encourages settlement and deters meritless appeals.
- Alternative sanction resolution: It is also worth considering if there is a half-way house between early settlement and full merits challenge. The Financial Conduct Authority has introduced “focused resolution agreements” under which firms accept some parts of their regulatory decision (e.g. particular factual findings/breaches) and challenge others (e.g. challenging other breaches or aspects of the penalty). This might have a role here.
- Advisory panel: Given the complex societal questions raised by some of these powers, Ofcom needs to avoid issuing dry technocratic edicts. It could set up and consult an advisory panel representing stakeholders with a wide range of interests, including those in favour of this new framework (e.g. those working in child protection and counter-terrorism), those against it (e.g. freedom of speech campaigners) and those who actually understand how the technology works. The panel should be consulted and provide a public non-binding opinion on Ofcom’s decision.
This model has been used successfully to advise on complex technological and societal issues in both industry, such as Meta’s Oversight Board, and in Government, such as the statutory Technology Advisory Panel set up to provide advice to the Investigatory Powers Commissioner. Indeed, Ofcom already benefits from a number of advisory panels such as the Spectrum Advisory Board and the Advisory Committee for Older and Disabled People.
As Lord Reed points out in A v British Broadcasting Corporation (2014), “society depends on the courts to act as guardians of the rule of law. Sed quis custodiet ipsos custodes?”: but the courts can only exercise the discretion afforded to them. Parliament should ensure that the Courts are given the right powers to ensure that Ofcom regulates this space fairly and effectively.
[1] The offence of assisting unlawful immigration under section 25 of the Immigration Act 1971 requires knowledge that the act will facilitate a breach, or attempted breach, of immigration law and knowledge that the relevant individual is not a UK national.
[2] As @cyberleagle points out, there are proposals to amend the Bill to include “foreign interference” under the National Security Bill as illegal content to which the Online Safety Bill applies. One potential element of this offence is an act “causing spiritual injury”.