
Age gate or make it child-safe: The choice facing online platforms in the UK

At some point in 2025, many online platforms in the UK will face a stark choice: either introduce age assurance technology to stop under-18s accessing the service, or implement measures to make the entire service “safe” for under-18s by preventing users from encountering content that is harmful to children (pornography; content that encourages suicide, self-harm or eating disorders; abuse; bullying; instructions for violence; and so on).

That age assurance technology will need to be “highly effective”. This might mean methods such as photo-ID matching, facial age estimation or digital identity services. Asking users to self-declare their age, or putting a condition in your terms of service that children are not allowed to access the platform, won’t cut it.

That is the (slightly simplified) upshot of Ofcom’s major new consultation, published earlier this week, on implementing the “children’s duties” in the UK’s Online Safety Act 2023 (OSA).
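To make the self-declaration point concrete, here is a purely illustrative sketch of an age gate that treats self-declared age as insufficient. Every type and method name below is hypothetical: the OSA is technology-neutral and Ofcom do not prescribe any particular implementation.

```typescript
// Purely illustrative: an age gate that only trusts "highly effective" methods.
// The method names are our own labels, not an Ofcom taxonomy.

type AgeSignal =
  | { method: "self_declaration"; claimedAge: number }
  | { method: "photo_id_matching"; estimatedAge: number }
  | { method: "facial_age_estimation"; estimatedAge: number }
  | { method: "digital_identity_service"; estimatedAge: number };

/** True only if a highly effective method indicates the user is 18 or over. */
function mayAccessWithoutChildSafetyMeasures(signal: AgeSignal): boolean {
  // Self-declared age (like a terms-of-service ban on children) won't cut it,
  // so it never unlocks access, whatever age the user claims.
  if (signal.method === "self_declaration") {
    return false;
  }
  return signal.estimatedAge >= 18;
}
```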

And the measures apply to more than just social media platforms. All “user to user” and “search” services will need to comply. These definitions are broad enough to cover social media, video sharing, private messaging, gaming, dating, review services, file and audio sharing, discussion forums, web search and more.

Quick recap: How did we get here?

This is the latest consultation by Ofcom on how they will implement the UK’s ambitious OSA. After a long gestation period, the OSA finally passed in October 2023 (The Dawn of a New Era: The UK’s Online Safety Act is here!, Jemma Purslow, Ben Packer, Georgina Kon (linklaters.com)).

The OSA is aimed at making the internet safer for individuals in the UK. It does this by imposing a range of duties on “user to user services”, “search services” and providers of pornographic content. In broad terms, these require those services to identify and mitigate harm from illegal content and content that is harmful to children.

Ofcom have already consulted on their proposals concerning illegal content (The rubber hits the road: Ofcom launch their first consultations on the OSA 2023, Georgina Kon, Ben Packer, Jemma Purslow, Ria Moody (linklaters.com)). They are also in the process of calling for evidence on the additional duties that will apply to what are termed “categorised services”, the circa 35-60 services that have certain functionalities and large numbers of users (Call for evidence: Third phase of online safety regulation - Ofcom).

This latest consultation concerns the “children’s duties” in the OSA. By “children,” the OSA means anyone under the age of 18.

What do the proposals cover?

Under the OSA, services will essentially need to do three things in relation to children:

  1. Assess whether children can access their service or part of it.
  2. If they can, complete a risk assessment to identify the risk their service poses to children.
  3. Implement safety measures to mitigate those risks to children.

Ofcom’s consultation sets out what they will expect of platforms in all three areas. Below, we have summarised some of the proposals that struck us as most notable for each task.

1. Assessing whether children can access the services

To assess whether children can access the service, providers must:

  1. Determine whether it is possible for children to access their service, or part of it. Providers can only conclude that children cannot access the service if they use age assurance measures that actually prevent children from doing so.
     
  2. If so, determine whether there are significant numbers of children using the service and/or whether the service is likely to attract a significant number of children. Without age assurance, providers may not be able to tell whether a significant number of children are actually using the service, so Ofcom say that providers might want to focus on the second limb. That involves considering whether the service benefits children (e.g. entertainment, education or support), whether its content or design appeals to children, and whether child users form part of the service’s commercial strategy (for example, if ads targeted at children are a revenue stream).

The outcome of the assessment must be recorded, along with detailed evidence.
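Read together, the two steps amount to a simple decision tree. Here is a minimal sketch of that logic as we understand the proposals; the function and field names are ours, not Ofcom’s:

```typescript
// Hypothetical encoding of the two-stage child access assessment.
// The binding test is in the OSA and Ofcom's draft guidance, not this sketch.

interface AccessAssessmentInputs {
  // Stage 1: only age assurance that prevents child access can rule it out.
  ageAssurancePreventsChildAccess: boolean;
  // Stage 2: the two alternative limbs.
  significantNumberOfChildUsers: boolean;
  likelyToAttractSignificantNumberOfChildren: boolean;
}

/** True if the service should be treated as accessed by children. */
function childrensDutiesEngaged(inputs: AccessAssessmentInputs): boolean {
  // Stage 1: can children access the service (or part of it) at all?
  if (inputs.ageAssurancePreventsChildAccess) {
    return false; // children cannot access, so the assessment stops here
  }
  // Stage 2: either limb is enough to engage the children's duties.
  return (
    inputs.significantNumberOfChildUsers ||
    inputs.likelyToAttractSignificantNumberOfChildren
  );
}
```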

2. Completing the children's risk assessment

To complete their risk assessment, services: 

  • must consult Ofcom’s Children’s Risk Profiles;
     
  • should follow Ofcom’s proposed four-step methodology.

3. Implementing safety measures to protect children

Ofcom are proposing more than 40 safety measures that they expect online services to take to protect children. These fall broadly into three areas: (i) implementing robust governance and accountability; (ii) making safer platform design choices; and (iii) providing children with information, tools and support.

Most notably, these proposals include:

  • all user-to-user and search services naming a person accountable to the most senior governance body for compliance with children’s safety duties;
     
  • implementing age assurance technology (as described at the start of this article);
     
  • calibrating recommender algorithms to ensure certain types of content are not recommended to children (a simplified sketch of this filtering step follows this list);
     
  • providing children with options to accept or decline invitations to group chats; and
     
  • search engines providing crisis prevention information in response to certain search requests (e.g. those relating to suicide, self-harm and eating disorders).
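On the recommender measure, Ofcom do not prescribe an implementation; one way to picture the intended effect is as a filtering step before ranking, as in the sketch below. The harm categories mirror the examples in this article, and every name is hypothetical:

```typescript
// Purely illustrative: excluding content harmful to children from a child's feed.
// The categories echo the examples above, not Ofcom's taxonomy.

type HarmCategory =
  | "pornography"
  | "suicide_or_self_harm"
  | "eating_disorder"
  | "abuse_or_bullying"
  | "violence_instructions";

interface Candidate {
  itemId: string;
  harmLabels: HarmCategory[]; // from an upstream content classifier (assumed)
  score: number;              // relevance score from the base recommender
}

/** Drops flagged content for child accounts before ranking as usual. */
function recommendFor(candidates: Candidate[], isChild: boolean): Candidate[] {
  const eligible = isChild
    ? candidates.filter((c) => c.harmLabels.length === 0)
    : candidates;
  // Rank the remaining candidates by score, highest first.
  return [...eligible].sort((a, b) => b.score - a.score);
}
```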

What’s next?

The consultation documents are vast, totalling several hundred pages. Ofcom are inviting responses by 17 July 2024 and expect to finalise their proposals in spring 2025. The duties will then come into force within a few months.

Given just how significant the proposals are, those in scope will want to ensure that their voice is heard before Ofcom move to bring them into effect.

Tags

online safety