Internet Matters

Rising harms, new rules: Why the Online Safety Act matters

Mitali Sud | 23rd June, 2025

Ahead of the Online Safety Act’s enforcement in July 2025, 77% of UK children aged 9 to 17 have experienced harm online. This is an 8% increase since last year, according to the latest Internet Matters Pulse.


How will the Protection of Children Codes impact kids’ online safety?

In July 2025, the Protection of Children Codes (“the Codes”) are expected to come into force. Ofcom developed them as part of the Online Safety Act.

Technology companies serving UK users now have a new responsibility: they must identify the risks their platforms create for children, and assess and reduce those risks.

The Codes set out how services should prevent or limit children’s access to harmful content, and require that children’s safety is built into a service’s design and operation from the start.

Internet Matters welcomes this important and long-anticipated milestone in the UK’s approach to children’s online safety. Yet our latest Pulse data suggests the scale and urgency of the challenge are only increasing: 77% of children aged 9 to 17 have experienced harm online, an increase of 8% in the last year.

This article explains what our data reveals about children’s experiences online, why the Codes should be treated as a starting point rather than a final step, and where regulators and the Government need to do more.

What Internet Matters Pulse reveals about the volume of online harm

Our Pulse findings show that children continue to face a wide range of harms online. These include exposure to violent content, bullying and unwanted contact from strangers. We track some harms, like too much screen time, that the Online Safety Act does not cover. However, many do fit within the law’s content categories.

For example, 9% of children report seeing pornographic content. Based on 2023 population estimates from the Office for National Statistics, this equates to approximately 663,000 children aged 9 to 17 across the UK. The Act classifies pornography as Primary Priority Content (PPC), which platforms must actively prevent children from accessing using highly effective age assurance.

Meanwhile, 23% of children (or roughly 1.7 million) say they’ve seen content promoting dangerous stunts or challenges. This is a form of Priority Content (PC) that must also be age-restricted under the Codes.

The chart below captures the range of harms reported in our research.

Why demographics and experience matter

Online harms are not experienced equally. Understanding who is harmed and how matters just as much as knowing how many children are affected. Age, gender, socio-economic background and other factors shape the likelihood, type and severity of the risks children face.

Our latest research shows that girls are more likely than boys to come across unrealistic body image content (22% of girls cf. 16% of boys), while older children are more likely than younger children to be contacted by strangers (28% of 13- to 17-year-olds cf. 23% of 9- to 12-year-olds).

Similarly, we find significant differences between children with vulnerabilities and those without. Vulnerable children are more than twice as likely as their non-vulnerable peers to experience online bullying from people they know (18% cf. 7%).

To be effective, the UK’s online safety regime must respond to these demographic-specific risks. One immediate step would be to turn Ofcom’s guidance on Violence Against Women and Girls (VAWG), which is currently non-statutory, into a statutory Code of Practice. Without a statutory footing, platforms are not required to implement the measures the guidance recommends, leaving continued gaps in protection for those at risk.

Platforms must create age-appropriate experiences

Children have different experiences at different ages, so it is all the more important that they access only platforms, services and content that are age appropriate. The Act’s Codes attempt to address the content side of this, recognising that providers should support children in having age-differentiated online experiences. But this principle is undermined by the lack of enforcement of minimum age requirements.

Currently, the Codes only “strongly encourage” platforms to uphold their own minimum age terms; they do not require platforms to implement measures to enforce them. This is despite data showing that children under 13, the minimum age for most major platforms, regularly access services well before they are meant to, exposing them to harmful content and contact.

We recognise that legislation limits Ofcom’s powers in this area. But if minimum age requirements are to serve any protective purpose, the Government must step in, whether through new legislation or statutory guidance, to mandate their meaningful enforcement. This would in turn support children in having age-appropriate online experiences.

How do children and parents feel about the online world?

Our Pulse survey also asked parents and children whether they believe the internet is becoming safer for children. Children are more optimistic than parents (30% of children agree cf. 14% of parents), but overall more of both groups disagree with the statement (45% of parents; 46% of children). Meaningful implementation of the Codes should help shift public perception and deliver tangible improvements to children’s safety online.

What comes next in online safety policy?

Children and parents disagree that the online world is getting safer for children. We are hopeful that the Online Safety Act will drive positive change for children online. But regulation cannot remain static. Children’s online lives evolve rapidly, and so too must the policy response. A lasting, effective safety framework must be evidence-led, dynamic and rooted in children’s lived experiences.

To that end, Internet Matters is calling on Ofcom and the Government to:

At Internet Matters, we will continue to monitor how the Codes are implemented and assess their impact, and we remain committed to sharing insights through Internet Matters Pulse that can help improve online safety for all children.

Find out more

Get personalised advice and ongoing support

The first step to ensuring your child’s online safety is getting the right guidance. We’ve made it easy with ‘My Family’s Digital Toolkit.’