
Online age checks and children's safety measures must be in place from 25 July

From Friday 25 July, tech platforms must have measures in place to ensure that children cannot access pornography and other harmful material. It is the final deadline for in-scope tech services to conduct children's risk assessments and adopt measures to protect children online.

This follows the publication of the Protection of Children Codes of Practice and the Children's Risk Assessment Guidance in April. In-scope platforms and services that have not complied by the deadline risk severe sanctions, including penalties of up to £18 million or 10% of global turnover, whichever is greater.

In-scope services and platforms should have conducted appropriate Children's Access Assessments back in April to determine whether their service (or any part of it) was capable of being accessed by children. Where those assessments found that a service was likely to be accessed by children, the OSA's children's safety duties apply.

These duties include taking proportionate measures to prevent under-18s from encountering harmful content, mitigating and managing risks to children in the different age groups identified, implementing easy-to-use complaints and content-reporting procedures, and clearly communicating how the service is managing the risks of harm. Ofcom has stated that measures to achieve these obligations include:

  • employing highly effective age verification;
  • content moderation;
  • adopting robust internal governance;
  • having clear terms and conditions that are capable of being understood by children;
  • enabling reporting and complaints procedures that are easy for children to use; and
  • providing user controls, including blocking and muting functionalities.

And whilst services work hard to understand and implement these new measures, more compliance obligations are on the horizon: Ofcom has already launched a new consultation on additional safety measures, which seeks to further strengthen those contained in the existing codes, including the Protection of Children Codes of Practice.

Enforcement programmes

Ofcom is extending its existing age assurance enforcement programme, previously focused on studio porn services, to cover all platforms that allow users to share pornographic material, whether they are dedicated adult sites or other services that include pornography.

It is also launching a new age assurance enforcement programme, building on work undertaken by its "small but risky" taskforce, to monitor the response from industry. This will specifically target sites dedicated to the dissemination of harmful content, including content relating to self-harm and suicide, eating disorders or extreme violence/gore.

Finally, it is launching a monitoring and impact programme, primarily focused on the biggest platforms where children spend most time – including Facebook, Instagram, Roblox, Snap, TikTok and YouTube. This will include: 

  • a comprehensive review of these platforms' efforts to assess risks to children; these risk assessments must be submitted to Ofcom by 7 August at the latest, and Ofcom will report on its analysis of them later this year;
  • scrutinising these platforms' practical actions to keep children safe, details of which must be disclosed to Ofcom by 30 September. Ofcom will interrogate in particular: whether they have effective means of knowing who their child users are; how their content moderation tools identify types of content harmful to children; how effectively they have configured their algorithms so that the most harmful material is blocked in children's feeds; and how they have prevented children from being contacted by adult strangers;
  • tracking children's online experiences to judge whether safety is improving in practice, through its ongoing programme of children's research and by consulting with children through new work with the Children's Commissioner for England; and
  • swift enforcement action if evidence suggests that platforms are failing to comply with their child safety duties.

Repeal of video-sharing regime

In addition, the Online Safety Act 2023 (Commencement No. 6) Regulations 2025 have been made. They repeal the video-sharing regime with effect from 25 July. From that date, video-sharing platforms are regulated solely under the OSA and are subject to each of the relevant duties of care set out in Part 3 of the OSA.


Authors

Laura Harper (Partner)
laura.harper@lewissilkin.com

Helen Hart (Managing Knowledge Lawyer)
helen.hart@lewissilkin.com
