Recent riots and civil unrest during the summer, allegedly spurred by online content, have once more pushed online safety to the forefront of discussions in the UK. This, combined with the new Government’s promise to speed up the implementation of the UK’s Online Safety Act 2023 (OSA), has seen Ofcom put under pressure to explain when and how the OSA’s main duties would take effect. Indeed, on 16 October 2024, the Rt Hon Peter Kyle MP, Secretary of State for Science, Innovation and Technology, wrote to Ofcom emphasising that it is “incredibly important” that the OSA is in place “as soon as possible”.
The challenge Ofcom faces is that the provisions of the OSA cannot simply be “switched on”. Instead, Ofcom must first undertake a process of gathering evidence, formulating draft codes of practice and guidance, consulting on those proposals and then either laying the codes of practice before Parliament or releasing final guidance. Even once they are finalised, in-scope services need time to review and comply with the final codes and guidance.
On 17 October 2024, Ofcom explained what it has been doing since the OSA became law a year ago and laid out an updated roadmap for what comes next. Anyone working in trust and safety for an in-scope service now has their “to do” list for 2025…
What comes next?
Ofcom has previously laid out its roadmap for when the various substantive duties will take effect. Though there are no dramatic changes to the plan, the timings for various milestones are now more definitive and, with each passing day, looking increasingly imminent:
- Illegal harms: Ofcom provided updated timing for the illegal harms Codes of Practice and illegal content risk assessment guidance. We can now expect these to be published in December 2024. In-scope services will be required to complete their illegal content risk assessments by mid-March 2025 and to comply with the Codes of Practice once they have passed through Parliament, which is also likely to be in March 2025.
- Children: The children’s access assessment guidance is due to be published in January 2025 and in-scope services will have to assess whether their service is likely to be accessed by children by April 2025. Ofcom then expects to publish the Protection of Children Codes of Practice and risk assessment guidance in April 2025. Services likely to be accessed by children must carry out a children’s risk assessment within three months, so likely by July 2025.
- Categorised services: Ofcom expects the Government to pass the secondary legislation confirming the thresholds for categorisation set out in its advice dated 29 February 2024 by the end of 2024. On that basis, it plans to request information from services to work out which ones should be categorised and then issue draft transparency notices to services “within a few weeks” of the register of categorised services being finalised. Categorised services will then need to publish their first OSA transparency reports by around the end of 2025.
Ofcom has “reprioritised” (delayed) the publication of the draft Codes of Practice on the other duties that apply to categorised services. It had previously indicated that these would be published in early 2025, but now expects publication by early 2026.
- Protecting women and girls: Ofcom now expects to publish draft guidance on online services’ role in protecting women and girls in February 2025.
As you can see from our graphic, this leaves 2024 and 2025 looking very busy for in-scope services:
What else is significant?
As well as clarification and updates on timing for the illegal harms and protection of children safety duties, the update also provides further insight into Ofcom’s current thinking and approach.
World-leading measures: Ofcom sees itself as operating at the forefront of online safety regulation and is keen to point out that it considers that its proposed Codes will require changes that have not been proposed by any other regulator around the world. For instance, Ofcom believes its measures will materially expand the number of what it calls “high risk services” that use hash matching to detect CSAM and will go far beyond current industry practice when it comes to age assurance, safer algorithms and other tools to help children stay safe.
Governance, design, user control and trust: Ofcom groups these measures into the four key areas where it wants to “deliver change” as the OSA is implemented: governance, safety by design, user control and building trust. The emphasis on governance is evident throughout the update, including a focus on in-scope services identifying senior people who are accountable for user safety. The trend for individual accountability seems to be transcending industries (see our previous discussion about a similar topic here: Blame it on me: Would a senior managers regime work for online harms?).
Other points of interest include:
- CSEA or terrorism notices consultation: In December 2024, Ofcom will launch a consultation on how it plans to use its power to require a regulated provider to use or develop technology to deal with CSEA or terrorism content.
- SME support: Ofcom has plans for a new ‘Digital Support Service’ of “interactive digital tools” to assist in-scope small and medium-sized enterprises’ efforts to comply.
- Super-complaints: Ofcom expects the guidance on ‘super-complaints’, which will cover when an entity is eligible to make a ‘super-complaint’ and the procedures for doing so, to be in place by Q4 2025.
- Repeal of the VSP regime: The Video Sharing Platforms regime is likely to be repealed after the Protection of Children Codes come into force around July 2025.
- This isn’t the end: It is more evident than ever that compliance with the OSA is not going to be ‘one and done’. Ofcom states that it already intends to build on the first editions of the illegal harms and children’s Codes of Practice with further measures, which it intends to consult on in spring 2025. This will include additional measures concerning automated tools to proactively detect illegal content and content harmful to children.
Likewise, it remains to be seen whether the political debate around online safety leads to amendments to the OSA (for instance, to reintroduce the legal but harmful provisions that were dropped during the drafting of the OSA) or the introduction of entirely new laws (see, for instance, the private member’s bill proposed by Josh MacAlister, MP for Whitehaven and Workington).
What if we don’t comply?
Ofcom has emphasised on multiple occasions that it is ready to enforce and does so again in its update (“we are ready to launch immediate enforcement action”).
In particular, Ofcom says that it “expect[s] [its] early enforcement action to focus on ensuring services are adequately assessing risk and putting in place the measures that will be most impactful in protecting users, especially children, from serious harms such as those relating to CSAM, pornography and fraud”.
Ofcom’s enforcement powers are broad, including the ability to impose fines of up to 10% of worldwide revenue, take business disruption measures and even take criminal action in certain circumstances. These powers were covered in depth in our webinar from November 2023 (Online Safety Act Webinar Series - session 2).
In-scope services face a wave of regulation, of which the OSA is just one part, and 2025 looks set to be busier than ever.