Ahead of the Online Safety Act’s enforcement in July 2025, 77% of UK children aged 9 to 17 have experienced harm online. This is an 8% increase since last year, according to the latest Internet Matters Pulse.
Summary
- The Online Safety Act’s Protection of Children Codes place statutory duties on platforms operating in the UK to protect children online.
- Internet Matters Pulse shows that online harms remain widespread and growing. Three in four children aged 9 to 17 report at least one harmful experience online.
- Children’s experiences vary by age, gender and socio-economic background. This highlights the need for a safety framework that responds to demographic-specific risks.
- Statutory action is needed to close two key gaps: the non-binding status of Ofcom’s Violence Against Women and Girls (VAWG) guidance, and the lack of mandated enforcement of minimum age requirements.
- More children and parents disagree than agree that the online world is getting safer for children. This underlines the importance of effective implementation of the Act and visible progress.
- Effective regulation must be future-proofed through further research into children’s lived experiences.
How will the Children Codes impact kids’ online safety?
In July 2025, the Protection of Children Codes (“Codes”), developed by Ofcom as part of the Online Safety Act, are expected to come into force.
Technology companies operating in the UK now have a new responsibility: they must identify the risks their platforms create for children, and assess and reduce those risks.
The Codes set out how services should prevent or limit children’s access to harmful content, and require that children’s safety is built into services’ design and operation from the start.
Internet Matters welcomes this important and long-anticipated milestone in the UK’s approach to children’s online safety. Yet our latest Pulse data suggests the scale and urgency of the challenge are only increasing: 77% of children aged 9 to 17 have experienced harm online, an increase of 8% in the last year.
This article explains what our data reveals about children’s experiences, why we need to treat the Codes as a starting point rather than a final step, and where regulators and the Government need to do more.
What Internet Matters Pulse reveals about the volume of online harm
Our Pulse findings show that children continue to face a wide range of harms online, including exposure to violent content, bullying and unwanted contact from strangers. We track some harms, such as excessive screen time, that the Online Safety Act does not cover; however, many do fit within the law’s content categories.
For example, 9% of children report seeing pornographic content. This equates to approximately 663,000 children aged 9 to 17 across the UK, based on 2023 population estimates from the Office for National Statistics. The Act classifies this as Primary Priority Content (PPC), which platforms must actively prevent children from accessing using highly effective age assurance.
Meanwhile, 23% of children (or roughly 1.7 million) say they’ve seen content promoting dangerous stunts or challenges. This is a form of Priority Content (PC) that must also be age-restricted under the Codes.
The chart below captures the range of harms reported in our research.
Why demographics and experience matter
Online harms are not experienced equally. Understanding who is harmed, and how, matters just as much as knowing how many children are affected. Age, gender, socio-economic background and other factors shape the likelihood, type and severity of the risks children face.
Our latest research shows that girls are more likely than boys to come across unrealistic body image content (22% of girls cf. 16% of boys), while older children are more likely than younger children to be contacted by strangers (28% of 13- to 17-year-olds cf. 23% of 9- to 12-year-olds).
Similarly, we find a significant difference between those with vulnerabilities and those without. Vulnerable children are more than twice as likely as their non-vulnerable peers to experience online bullying from people they know (18% cf. 7%).
To be effective, the UK’s online safety regime must respond to demographic-specific risks. One immediate step would be to turn Ofcom’s guidance on Violence Against Women and Girls (VAWG), which is currently non-statutory, into a statutory Code of Practice. Without a statutory footing, platforms are not required to implement the measures the guidance recommends, leaving continued gaps in protection for those at risk.
Platforms must create age-appropriate experiences
Given that children’s experiences differ by age, it is all the more important that they only access platforms, services and content that are age appropriate. While the Act attempts to address content through its Codes, recognising that providers should support children in having age-differentiated online experiences, this principle is undermined by the lack of enforcement of minimum age requirements.
Currently, the Codes only “strongly encourage” platforms to uphold their own minimum age terms; they do not require platforms to implement measures to enforce them. This is despite data showing that children under 13, the minimum age for most major platforms, are regularly accessing services well before they are meant to, exposing them to harmful content and contact.
We recognise that legislation limits Ofcom’s powers in this area. But if minimum age requirements are to serve any protective purpose, the Government must step in, whether through new legislation or statutory guidance, to mandate meaningful enforcement of minimum age terms. This would in turn support age-appropriate online experiences for children.
How do children and parents feel about the online world?
Our Pulse survey also asked parents and children whether they believe the internet is becoming safer for children. While children are more optimistic than parents (30% of children cf. 14% of parents), more parents and children overall disagree with this statement than agree (45% of parents; 46% of children). Meaningful implementation of the Codes should help shift public perception and deliver tangible improvements to children’s safety online.
What comes next in online safety policy?
We are hopeful that the Online Safety Act will drive positive change for children online. But regulation cannot remain static. Children’s online lives evolve rapidly, and so too must the policy response. A lasting, effective safety framework must be evidence-led, dynamic and rooted in children’s lived experiences.
To that end, Internet Matters is calling on Ofcom and the Government to:
- Make the VAWG guidance statutory: Platforms should be legally required to adopt Ofcom’s guidance on tackling violence against women and girls. This will ensure consistent action on gender-based harms.
- Mandate enforcement of minimum age requirements: The Government must close the legislative gap that allows underage access to platforms to persist unchecked.
- Prioritise age- and development-appropriate experiences: Platforms should be expected to tailor their design and safety measures to children’s age, stage of development, and specific vulnerabilities.
- Use data to future-proof regulation: Ofcom and the Government should support research with children, including input from civil society, to help ensure the regulatory framework stays responsive to new risks.
At Internet Matters, we will continue to monitor how the Codes are implemented and assess their impact. We are committed to sharing robust insights through Internet Matters Pulse to help improve online safety for all children.