
Ofcom motors ahead on online safety: Three trends for online platforms and search services to be aware of

While Europe’s Digital Services Act has been finalised with the key obligations set to apply to the biggest platforms later in 2023, the UK’s supposedly world-leading equivalent law – the Online Safety Bill – continues to edge its way through the legislative process. The Bill is currently in the House of Lords and looks unlikely to receive Royal Assent until autumn at the earliest.

However, Ofcom is not standing still. Far from it: Ofcom has reportedly recruited 300 staff already to enforce the Bill, with over 100 hires still to come. These include former police officers, counter-terrorism and child abuse experts from the National Crime Agency, individuals from Big Tech, and former employees of other regulators such as the Information Commissioner’s Office and the Financial Conduct Authority. Ofcom has already concluded its first call for evidence, on its priority area of illegal content, and the deadline for responses to its second call for evidence, focused on content that may be harmful to children, expires next week. It has also released a range of papers on topics such as automated content moderation, transparency reporting and risk assessments, with apparently over 100 more research papers in the works.

So plenty of detail to digest, but what broader trends can those likely to be within scope of the Bill – user-to-user platforms and search services – take away from Ofcom’s activity to date?

Ofcom will be ready, so you should be too: 

First, Ofcom has made it clear that it will be ready for this Bill and so those in scope must be ready to comply. In recent evidence to a Parliamentary Select Committee, Dame Melanie Dawes, CEO of Ofcom, said that the Codes of Practice on illegal content and the protection of children will be released for consultation “immediately” after the Bill passes, rather than in the first 100 days. 

This will be when the industry sees, for the first time, what Ofcom will expect as a regulator in this area and therefore the moment where “rubber hits the road” and conversations with those in scope will get “more serious and more real”. 

Ofcom has already previewed this in its approach to the regulation of video-sharing platforms, where its report on the first year of regulation emphasised the need for platforms to be ready to be regulated. That said, representatives of Ofcom have also made it clear that regulation will not be all stick and no carrot: indeed, Mark Bunting, Online Safety Policy Director at Ofcom, said in a recent interview that Ofcom is hiring and training a supervisory unit that will look to engage one-to-one with the biggest and riskiest services.

It’s not all about take down: 

Second, many content regulation regimes globally focus on the take-down of individual pieces of content. The Bill instead focuses on the overall systems and processes that services put in place.

Ofcom has emphasised repeatedly that the focus is not just on taking down violating content but on preventative measures that stop users from encountering certain types of content or conduct in the first place: for instance, age-gating pornographic material, using hash-matching technology to block child sexual abuse imagery, and making design changes to prevent the grooming of children.

Those in scope should therefore be thinking not just about how they react to illegal content or content harmful to children, but about the features that can be built in at various stages to prevent users from encountering such content in the first place.

It’s time to get serious about risk: 

Finally, Ofcom’s focus on the full product lifecycle is just one part of its broader focus on ensuring those in scope embed good risk management in their organisations. In this week’s document explaining its approach to risk assessments, Ofcom says that a fundamental part of its role as a regulator is to “promote good practice around risk management as a fundamental part of service design and organisational culture”.

It adds that “good risk management relies on a culture of risk awareness and risk prioritisation by all teams across an organisation” and cites internationally recognised standards (such as ISO 31000) and the three lines of defence model as exemplars of good practice in this regard.

What this means is that those in scope should not think of risk assessments as an isolated, occasional task for a compliance department that belongs to a galaxy far, far away – but as a fundamental part of broader risk management processes, which those at senior level own and scrutinise regularly.

Looking ahead

It is clear that Ofcom is using the additional preparation time gifted by the delay of the Bill effectively, and platforms would be wise to do the same. The longer the Bill takes to pass, the less patience Ofcom is likely to have for lack of preparedness.

"the starting gun for us as a regulator will come later than we were assuming last summer, but we are trying to be ready to move quicker when that moment arrives"
