Ofcom’s Protection of Children Codes - Happy Birthday to me 🎂

Who would have thought that my best birthday present would be Ofcom’s Children’s Codes?!

That’s right, I was busy opening a new pair of shoes that my wife had bought me on my birthday, the 24th of April, when something intriguing caught my eye.

It was as if Ofcom, the UK media regulator, knew it was my birthday (I can only assume they did) when it published the Protection of Children Codes in draft form, as submitted to the Secretary of State.

Ofcom’s Protection of Children Codes outline the regulatory measures under the UK’s Online Safety Act aimed at safeguarding children from harmful online content. The Codes will take effect from 25 July 2025, provided the draft completes the parliamentary process.

If you listen to the Giant Robots podcast, you’ll have heard me talk about my own kids, probably more often than you would like.

But with my oldest turning 9, we are just starting to enter the battlefield of the internet, and you don’t need me to tell you that it can be a scary place out there…

A picture of a scary internet meme

However, the Children’s Codes offer a sliver of hope.

What the codes enforce:

Under the UK’s Online Safety Act, Ofcom’s children’s safety codes mandate that online services likely to be accessed by children assess and mitigate the risks of exposure to harmful content.

This includes implementing effective age assurance measures, conducting risk assessments, and adopting proportionate safety measures to protect children from content related to pornography, suicide, self-harm, and eating disorders.

Services are also required to provide clear terms of service outlining their safety measures, establish easy-to-use reporting mechanisms, and designate an individual accountable for compliance.

Who they affect:

These regulations apply to user-to-user and search services that are likely to be accessed by children in the UK, regardless of where the service is based.

This encompasses social media platforms, messaging apps, video-sharing services, and search engines.

Services must determine if children are part of their user base and, if so, carry out risk assessments and implement appropriate safety measures accordingly.

There is a really helpful tool on Ofcom’s website where you can check if you will be affected.

Consequences of non-compliance:

Failure to adhere to these codes can result in significant penalties, including fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. Persistent or serious breaches may also lead to criminal sanctions for senior managers responsible for compliance.
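To make the "whichever is greater" rule concrete, here is a tiny illustrative calculation (the function name and the revenue figures are my own examples, not anything from Ofcom):

```python
# Illustration of the penalty ceiling described above: the greater of
# £18 million or 10% of qualifying worldwide revenue.
def max_penalty(global_revenue_gbp: float) -> float:
    """Return the maximum possible fine for a given worldwide revenue."""
    return max(18_000_000, 0.10 * global_revenue_gbp)

# A smaller service: 10% of £50m is £5m, so the £18m figure applies.
print(max_penalty(50_000_000))     # 18000000
# A larger service: 10% of £1bn is £100m, which exceeds £18m.
print(max_penalty(1_000_000_000))  # 100000000.0
```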


Now, some people argue this does not go far enough, and they may be right, but at the very least it feels like we, as a society, are taking a step in the right direction.

Moreover, once you break down the recommended safety measures, there are some really practical things you can start doing to work towards compliance.

Access Assessment

  • What’s required: Determine whether children are likely to access the service (e.g. via sign-up flows, usage data, analytics).
  • Potential solution: Implement analytics dashboards and data tagging to monitor user demographics and behaviours; consider UX changes to deter underage use where appropriate.
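As a rough sketch of what feeding analytics into an access assessment might look like, here is a minimal example that estimates what share of a user base self-declares as under 18. Every name and number here is an illustrative assumption, not an Ofcom requirement:

```python
# Minimal sketch: one possible analytics input to an access assessment,
# based on self-declared ages collected at sign-up (an assumed data source).
def child_access_share(declared_ages: list[int]) -> float:
    """Fraction of sampled users whose declared age is under 18."""
    if not declared_ages:
        return 0.0
    under_18 = sum(1 for age in declared_ages if age < 18)
    return under_18 / len(declared_ages)

sample = [14, 22, 35, 16, 41, 13, 29]
share = child_access_share(sample)
print(f"{share:.0%} of sampled users declare as under 18")
```

In practice you would combine signals like this with usage data and UX research, since self-declared ages alone are famously unreliable.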

Age Assurance

  • What’s required: Use effective methods to confirm users’ ages where harmful content could be accessed.
  • Potential solution: Integrate age assurance tools (e.g. facial estimation APIs, ID checks, credit card verification), or consider low-friction UX for age gating.
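One way to structure the decision layer on top of whichever assurance providers you integrate is to treat each method as a signal and require at least one high-confidence pass. This is a hedged sketch; the method names, thresholds, and fields are placeholders, not any real provider’s API:

```python
# Sketch of an age-gating decision layer combining multiple assurance
# signals. Method names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AgeSignal:
    method: str        # e.g. "facial_estimation", "id_check", "credit_card"
    estimated_age: int
    confidence: float  # provider-reported confidence, 0.0 to 1.0

def is_age_assured(signals: list[AgeSignal], minimum_age: int = 18,
                   min_confidence: float = 0.9) -> bool:
    """True if at least one high-confidence signal meets the age bar."""
    return any(s.estimated_age >= minimum_age and s.confidence >= min_confidence
               for s in signals)

signals = [AgeSignal("facial_estimation", 17, 0.95),
           AgeSignal("id_check", 19, 0.99)]
print(is_age_assured(signals))  # True: the ID check clears the bar
```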

Content Moderation Systems

  • What’s required: Prevent children from encountering certain types of harmful content.
  • Potential solution: Build or fine-tune automated moderation pipelines using machine learning or keyword filtering. Adjust algorithms to filter out harmful content from children’s feeds.
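The keyword-filtering half of that pipeline can be sketched in a few lines. Real systems would layer ML classifiers on top, and the blocked-term list here is a stand-in rather than anything from the Codes:

```python
# Illustrative keyword-filter stage for a child-safe content feed.
# The term list is a placeholder; a real deployment would maintain
# curated lists and combine this with ML-based classification.
BLOCKED_TERMS = {"example-harmful-term", "another-blocked-term"}

def is_safe_for_children(text: str) -> bool:
    """Reject content containing any blocked term (case-insensitive)."""
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)

def filter_feed(posts: list[str]) -> list[str]:
    """Keep only the posts that pass the child-safety check."""
    return [post for post in posts if is_safe_for_children(post)]
```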

Reporting and Complaints Mechanisms

  • What’s required: Let users (especially children) easily report harmful content and complaints.
  • Potential solution: Design and implement reporting UIs and backend processes; ensure accessibility and usability for children.
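On the backend side, even a simple intake model helps you reason about the process. Here is a minimal sketch; the field names and the “reports from children go to a priority queue” rule are my own illustrative choices, not requirements from the Codes:

```python
# Sketch of a backend intake for user reports of harmful content.
# Field names and the triage rule are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ContentReport:
    content_id: str
    reason: str            # e.g. "self-harm", "bullying"
    reporter_is_child: bool
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def triage(report: ContentReport) -> str:
    """Route reports from child users to a priority queue."""
    return "priority" if report.reporter_is_child else "standard"
```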

If you really want to do further reading, there are some other recommendations, but I can’t be expected to read all of them; it was my birthday, after all!

The drafts themselves are split into two:

  1. Protection of Children Code of Practice for user-to-user services
  2. Protection of Children Code of Practice for search services

They are long and intense; you have been warned.

thoughtbot has a dedicated team in the UK who would be happy to answer your questions on next steps as these recommendations and the Online Safety Act make their way through Parliament. Just don’t forget to wish me a happy birthday! 🥳