
Enforcement of the UK’s Online Safety Act: where are we?

Ofcom, the UK’s online safety regulator, has kicked off enforcement under the Online Safety Act 2023 (OSA). What can we learn from the investigations it has launched so far, and what should in-scope platforms expect for the rest of 2025?

The OSA took several years to go from idea to law – and a further year and a half from being passed to the substantive duties beginning to take effect. However, as we head into the summer of 2025, the main duties in the OSA concerning illegal content and content that is harmful to children are taking effect and Ofcom is ramping up its enforcement activity accordingly.

The OSA applies to tech companies that offer user-to-user services (e.g. social media platforms, messaging services, file-sharing services, etc.), search services, and services that offer pornographic content.

What enforcement action has Ofcom taken so far under the OSA?

Ofcom’s practice to date has been to launch “enforcement programmes” on certain topics: dedicated programmes of work to assess providers’ compliance in targeted areas. Ofcom has launched three such programmes under the OSA so far, and under them it has recently opened a raft of new investigations.

1. Age assurance for pornography

One of the first parts of the OSA to take effect was the duty on so-called “Part 5 services”, i.e. services on which “regulated provider pornographic content” (pornographic content published by the provider itself, rather than generated by users) is published or displayed. Part 5 services were required to put in place highly effective age assurance (HEAA) measures to prevent children from accessing pornographic material by January 2025.

On 16 January 2025, Ofcom announced it had opened an enforcement programme into age assurance measures across the adult sector, which it said would focus initially on regulated providers’ compliance with their duties under Part 5 of the OSA.

To date, Ofcom has opened three investigations under this enforcement programme. All are examining whether smaller platforms have failed to implement HEAA measures on their services, as required under section 81 OSA. The providers concerned are Itai Tech Ltd (in relation to the “nudification” site Undress.cc), Score Internet Group LLC (in relation to one of its studio sites, Scoreland.com) and First Time Videos.

However, some of these investigations have proved short-lived, such as that into Score Internet Group LLC, which implemented HEAA to Ofcom’s satisfaction. Nonetheless, Ofcom notes in the investigation closure that it will continue to monitor the service, and that: “Should new concerns arise – such as the availability of pornographic content without effective age assurance measures in place – Ofcom’s Enforcement team may revisit this decision.”

2. Child sexual abuse imagery on file-sharing services

Ofcom’s Illegal Harms Codes identified file-sharing and file-storage platforms as user-to-user services that posed a “high risk of image-based child sexual abuse material (CSAM)”. Accordingly, on 17 March 2025, Ofcom launched a specific enforcement programme in relation to file-sharing/storage sites and the risk that they are used to share CSAM.

To date, Ofcom has launched seven investigations under this programme, focused on smaller, riskier file-sharing services: Im.ge, Krakenfiles, Nippybox, Nippydrive, Nippyshare, Nippyspace and Yolobit. All seven investigations concern whether the respective platforms failed to:

  • respond to a statutory information request;
  • complete a suitable and sufficient illegal content risk assessment;
  • make and keep a record of this assessment; and
  • comply with the safety duties about illegal content which apply in relation to regulated user-to-user services.

Ofcom has a distinct taskforce to engage with smaller sites that may present particular risks to users.

3. Illegal content risk assessments

On 3 March 2025, Ofcom announced its third enforcement programme, which focuses on whether providers are complying with their illegal content risk assessment duties and record-keeping duties under the OSA. Shortly after the deadline for regulated services to produce their illegal content risk assessments, Ofcom issued various information requests and received over 60 such risk assessments, including from a range of large services and from smaller services posing particular risks.

On 9 April 2025, Ofcom launched its first investigation under this enforcement programme, concerning an unnamed online suicide discussion forum, on the basis that it has failed, or is failing, to comply with its duties to protect its users from illegal content, as well as the duty to respond accurately to an information notice. According to Ofcom, the investigation was launched after it had made “several attempts to engage with this service provider in respect of its duties under the Act” and had issued “a legally binding request to submit the record of its illegal harms risk assessment to us”.

Ofcom’s second investigation under this programme was launched on 14 May 2025, into Kick Online Entertainment S.A. (Kick). As with the suicide forum, this investigation is examining whether Kick has failed to comply with its duty to carry out an illegal content risk assessment and has failed to respond to an information notice.

On 10 June 2025, Ofcom launched its third investigation under this programme, this time into 4chan. The investigation is wide-ranging and concerns non-compliance with the safety duties about illegal content, following complaints about the potential for illegal content and activity on 4chan, including the possible sharing of child sexual abuse material. The investigation will also cover the platform’s alleged failure to respond to a statutory information request and to complete and keep a record of its illegal content risk assessment.

What can we learn from Ofcom’s early enforcement work?

Though Ofcom’s enforcement of the OSA is in its infancy, there are already several themes that emerge:

  1. Ofcom has to date focused on “low-hanging fruit”: platforms that it suspects have simply done nothing (or next to nothing) to comply with their duties under the OSA. This includes services that have not responded to Ofcom information notices, that have clearly not implemented HEAA or that appear not to have completed an illegal content risk assessment. However, Ofcom’s enforcement focus is likely to turn in the coming months to those that have done something, but something Ofcom deems insufficient: for instance, platforms whose risk assessments are not considered “suitable and sufficient”, those that have implemented some form of age assurance which Ofcom deems not to be highly effective, and those that have otherwise not implemented sufficient measures to comply with their duties.
     
  2. Ofcom has demonstrated its willingness to pursue investigations under the OSA into smaller, riskier platforms, including those without an obvious presence in the UK. One development to watch in the coming months is how Ofcom deals with platforms that dispute its jurisdiction or simply ignore it, as the online suicide forum has suggested it may well do. Ofcom may well have to deploy the business disruption powers under the OSA, which enable it to ask the courts to impose measures to disrupt the businesses of non-compliant services (more on this below).
     
  3. Ofcom has been transparent that it cannot investigate every suspected failure to comply with the OSA and will therefore have to pick and choose enforcement cases that meet its priorities. The clear pattern to date is for Ofcom to open targeted enforcement programmes, issue compelled information notices to services it suspects are in scope, and then open investigations depending on those services’ responses (or lack thereof). We can expect this to continue as further parts of the OSA take effect, most notably the children’s duties that come into force after 24 July 2025.
     
  4. Many of Ofcom’s investigations include examining whether the service in question has failed to respond adequately to a compelled information notice. This is a theme that continues from Ofcom’s enforcement work under the predecessor “video-sharing platforms” regime, where it imposed fines of £1.875m (on TikTok) and £1.05m (on the parent company of OnlyFans) for failures to respond adequately to compelled information notices.

What will enforcement look like? 

Much has been said about the extent of Ofcom’s powers to fine services up to £18 million or up to 10% of qualifying worldwide revenue, whichever is greater. But Ofcom’s enforcement powers are broader than merely imposing financial penalties, and some of the sanctions in its toolbox are potentially more intrusive and impactful.
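
To put that fining power in perspective, the statutory ceiling works out as the greater of the two figures. Below is a minimal sketch of the arithmetic only; the helper name and revenue figures are illustrative assumptions, and Ofcom sets any actual penalty within (not at) this cap:

```python
# Minimal sketch of the OSA maximum-penalty arithmetic: the ceiling is the
# greater of £18m and 10% of qualifying worldwide revenue. The function name
# and revenue figures are hypothetical, for illustration only.

def osa_penalty_ceiling(qualifying_worldwide_revenue_gbp: float) -> float:
    """Return the statutory maximum fine for a given revenue figure."""
    return max(18_000_000.0, 0.10 * qualifying_worldwide_revenue_gbp)

for revenue in (50_000_000, 180_000_000, 500_000_000):
    print(f"£{revenue:>11,} revenue -> ceiling £{osa_penalty_ceiling(revenue):,.0f}")

# £ 50,000,000 revenue -> ceiling £18,000,000
# £180,000,000 revenue -> ceiling £18,000,000
# £500,000,000 revenue -> ceiling £50,000,000
```

As the figures suggest, the revenue-based limb only starts to bite above £180 million of qualifying worldwide revenue; below that, the £18 million figure is the operative cap.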

Ofcom has a broad range of powers, such as requiring a service to take steps to come into compliance (section 130(6) OSA) or requiring it to pay a penalty (section 130(7) OSA).

At the more extreme end, Ofcom has various practical powers that could significantly alter or disrupt the way in which services operate. Ofcom’s business disruption measures (sections 144–148 OSA) can be interim or permanent but, either way, could be hugely impactful for a service – for example:

  • Service Restriction Orders could require “ancillary services” (for example, payment processors or ad servers) to take steps aimed at disrupting the non-compliant provider’s conduct, business or revenue.
     
  • Access Restriction Orders could require those who provide “access services”, like internet access services or app stores, to take steps to impede access to the non-compliant service.

Ofcom’s enforcement arsenal will also include the ability to issue Technology Notices (section 121 OSA). Under this power, Ofcom can require a service either to use an accredited technology, or to use best endeavours to develop or source one, to identify child sexual exploitation and abuse (CSEA) and/or terrorism content, to swiftly take that content down, or to prevent individuals from encountering it. It is expected that Ofcom will complete the various consultations and issue the necessary guidance for this power to take effect during 2026.

There are various limitations and safeguards on Ofcom’s use of these powers, and Ofcom has acknowledged that they amount to a “significant regulatory intervention” that it will not seek as a matter of routine; nevertheless, platforms should be alive to the possibility of such disruptive and impactful powers being deployed in due course.

Who’s next? 

Ofcom will select cases strategically, guided by its enforcement priorities framework. The risk or seriousness of the harm, the strategic significance of the case, and resource implications and risks are all factors Ofcom will weigh in deciding whether to launch enforcement action.

Platforms now have to complete and record children’s risk assessments by 24 July 2025, and should bear in mind Ofcom’s demonstrated appetite to enforce against failures to meet compliance deadlines.

Ofcom has previously noted that platforms will have a six-month grace period during which to come into compliance with the Codes of Practice for Illegal Harms. However, no such period is directly referred to in Ofcom’s Protecting Children statement, where Ofcom instead notes that it anticipates its “early enforcement action to focus on ensuring providers are adequately assessing risk and putting in place the measures that will have the greatest impact on children’s safety”.

In other news… 

Enforcement by Ofcom is not, however, the only form of legal action arising under the OSA. On 8 May 2025, it was reported that Wikimedia is challenging parts of the so-called “categorisation regulations”, which set out how Ofcom will decide which services must comply with the further duties that apply only to categorised services. Wikimedia argues that the thresholds, as currently defined, risk not only inappropriately catching sites such as Wikipedia but also missing some platforms which, in its view, should be subject to the tougher rules.

Wikimedia is seeking to challenge regulations set by the Secretary of State on the advice of Ofcom, which had conducted research and consultation on where those thresholds should be set. This judicial review may delay Ofcom finalising the register of categorised services.

In the meantime, the wheels must keep on turning

It remains to be seen how Ofcom’s enforcement appetite and priorities will play out in a shifting global regulatory landscape, but given the potential fines and intrusive regulatory powers in Ofcom’s toolbox, in-scope services should be continuing to implement measures to comply with the OSA and monitoring Ofcom’s priorities and enforcement work closely.
