
5 ways the Online Safety Act is changing how children experience the online world

Mitali Sud | 4th November, 2025

It’s been three months since measures to make the online world safer for children came into force as part of the Online Safety Act’s Protection of Children Codes. In that time there has been plenty of discussion about the Act’s impact, with some raising concerns about privacy and others arguing it doesn’t go far enough to protect users.

While it is still early days and the full picture of its impact will take time to emerge, there are already some encouraging signs that the new rules are starting to make a difference for families.

Summary

Here are 5 ways the Act is changing how platforms operate, and what that means for children and parents.


  1. Many platforms are using age verification to protect children
  2. Terms of service are getting easier to understand
  3. Algorithms are being redesigned to reduce harm
  4. Platforms are improving parental controls
  5. Platforms are being held accountable

1. Many platforms are using age verification to protect children

The Act requires pornography websites to stop under-18s from accessing their content. This marks a major cultural shift: for the first time, adult content online is being age-restricted in the same way it is offline. These sites must now use “highly effective” tools to confirm a user’s age before they access adult content, which means more than simply ticking a box or entering a date of birth.

To do this, some pornography websites have begun using third-party services such as VerifyMy. These services employ privacy-preserving techniques to confirm a user’s age with existing data (for example an email address) without sharing personal information with the platform itself.
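
For readers curious about the mechanics, here is a minimal sketch of how such a check might work. All names in it (ThirdPartyVerifier, check_age) are invented for illustration; this is not VerifyMy’s actual API.

```python
# Minimal sketch of a privacy-preserving age check handled by a third
# party. All names here (ThirdPartyVerifier, check_age) are invented
# for illustration; they are not VerifyMy's actual API.

class ThirdPartyVerifier:
    """Stands in for an external age-assurance service."""

    def __init__(self, known_adult_emails: set):
        # The verifier holds its own records; the platform never sees them.
        self._known_adult_emails = known_adult_emails

    def check_age(self, email: str) -> bool:
        # Returns a single yes/no answer: is this user 18 or over?
        # No name, date of birth or document is passed to the platform.
        return email in self._known_adult_emails

def can_access_adult_content(email: str, verifier: ThirdPartyVerifier) -> bool:
    # The platform learns only the boolean result, not the user's data.
    return verifier.check_age(email)

# The platform asks the verifier, which answers from data it already holds.
verifier = ThirdPartyVerifier({"adult@example.com"})
print(can_access_adult_content("adult@example.com", verifier))  # True
print(can_access_adult_content("teen@example.com", verifier))   # False
```

The key design point is that the sensitive data stays with the verifier: the platform only ever receives a yes/no answer.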

What’s the impact on social media and online games?

Some social media and gaming platforms have also begun using such tools to verify whether users are under 18. These tools help platforms comply with the Act by allowing them to tailor experiences for children. For example, they can limit children’s access to legal but harmful content and apply stronger safety settings for child accounts.
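
As a rough illustration of what “tailoring experiences” can look like, the sketch below applies stricter defaults to an account verified as under 18. The setting names are invented examples, not any platform’s real configuration.

```python
# Illustrative sketch: stronger default safety settings for accounts
# verified as under 18. The setting names are invented examples, not
# any platform's real configuration.

def default_settings(is_under_18: bool) -> dict:
    if is_under_18:
        return {
            "account_private": True,         # profile hidden from strangers
            "direct_messages": "friends",    # only approved contacts can message
            "sensitive_content": "blocked",  # legal-but-harmful material filtered out
        }
    return {
        "account_private": False,
        "direct_messages": "everyone",
        "sensitive_content": "allowed",
    }

# A verified child account gets the stricter defaults automatically.
print(default_settings(is_under_18=True))
```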

How do people feel about age verification?

While there was some resistance to the use of highly effective age assurance, most of the UK public say they support these measures. 80% of parents tell us they are comfortable with children having to verify their age to sign up to apps or platforms, especially where it protects them from harmful content.

Since the new laws came into effect, Ofcom has been monitoring compliance closely. The ten most-visited pornography sites, including Pornhub, along with many smaller ones, have introduced highly effective age checks for UK users.

2. Terms of service are getting easier to understand

The Act requires platforms to make their terms of service clearer and more child-friendly. They should be written at a level the youngest user allowed on the service can understand.

TikTok is a good example of this in practice. Its new UK Terms of Service, created in line with the Act, are written in plain English and include an annex explaining how TikTok protects children, how to report harmful content and what happens once a report is submitted. The contrast with TikTok’s US Terms of Service is striking.

Screenshots comparing the two show TikTok’s UK Terms of Service using clearer language and structure, while the US Terms of Service still use more complex language.

This move towards greater transparency means that children and parents can better understand the rules they are agreeing to when they use an app.

3. Algorithms are being redesigned to reduce harm

Under the Act, platforms that recommend or promote content to users (for example, through “For You” feeds or autoplay features) must ensure those systems don’t push harmful material to children.

This means that platforms should automatically prevent children from seeing not only illegal content but also material that might be legal but harmful, such as videos promoting dangerous stunts. If a child does come across such content, the platform should avoid showing more of the same and instead vary recommendations or offer more positive alternatives.
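
A highly simplified sketch of that logic is below. The “harmful_for_minors” labels and topic fields are invented for illustration; real systems rely on trained classifiers and far richer signals.

```python
# Highly simplified sketch of a child-safe recommendation step. The
# "harmful_for_minors" labels and topics are invented for illustration;
# real systems use trained classifiers and far richer signals.

def recommend(candidates: list, recently_seen_topics: set) -> list:
    # 1. Filter out anything flagged as harmful for under-18s.
    safe = [c for c in candidates if not c["harmful_for_minors"]]
    # 2. Vary recommendations: topics the child was just shown go to the
    #    back of the queue instead of being reinforced.
    fresh = [c for c in safe if c["topic"] not in recently_seen_topics]
    repeats = [c for c in safe if c["topic"] in recently_seen_topics]
    return fresh + repeats

feed = recommend(
    candidates=[
        {"topic": "dangerous_stunts", "harmful_for_minors": True},
        {"topic": "cooking", "harmful_for_minors": False},
        {"topic": "gaming", "harmful_for_minors": False},
    ],
    recently_seen_topics={"gaming"},
)
print([item["topic"] for item in feed])  # ['cooking', 'gaming']
```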

It’s still early days, but there are indications that children are seeing a difference. Sky News spoke to six teenagers before and after the new rules came into force, and five said they noticed less harmful content appearing in their feeds.

Tip for parents

Encourage children to use features that let them like, dislike, or down-rank videos. These actions teach the algorithm what they want to see more (or less) of, helping to build a safer, more positive feed.
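
To see why those signals matter, here is a toy sketch of how a feed might turn likes and dislikes into topic weights. The weighting scheme is invented for illustration only.

```python
# Toy sketch: how like/dislike feedback can steer what a feed shows.
# The weighting scheme is invented for illustration only.

from collections import defaultdict

topic_weights = defaultdict(float)

def record_feedback(topic: str, liked: bool) -> None:
    # A like nudges a topic up; a dislike pushes it down harder, so
    # "not interested" signals take effect quickly.
    topic_weights[topic] += 1.0 if liked else -2.0

record_feedback("wildlife", liked=True)
record_feedback("dangerous_stunts", liked=False)
print(dict(topic_weights))  # {'wildlife': 1.0, 'dangerous_stunts': -2.0}
```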

4. Platforms are improving parental controls

While there is currently no requirement for platforms to offer parental controls, the regulator has identified this as one of its top five priorities for the next phase of its children’s safety work.

In the meantime, some platforms have taken voluntary action. Roblox, for example, launched parental insights for parents of teens, making it easier for them to oversee their teen’s in-game settings. The company has also committed to introducing age estimation for all users and parental verification for younger players by the end of this year.

These steps show growing recognition that parents play a vital role in children’s online safety, and that they need better tools to fulfil it.

5. Platforms are being held accountable

Perhaps most importantly, the new rules come with real consequences, such as financial penalties, if platforms don’t comply with them.

As of October 2025, Ofcom has opened 21 investigations covering 69 sites and apps to assess how they are complying with the Act. These include file-sharing services, pornography sites and online forums. Some investigations have already led to improvements.

For example, following Ofcom investigations, file-sharing services that were being used to distribute child sexual abuse material have installed automated technology to detect and remove that content.

Ofcom has also issued fines against companies that refuse to cooperate. The provider of social media platform 4chan, for instance, was fined £20,000 in October 2025 for failing to provide legally required information, and the company faces a daily penalty until it complies.

Some services, such as an online suicide forum, have chosen to block UK users entirely rather than meet the new safety requirements. Ofcom says these sites remain on its watchlist to ensure they don’t encourage users to bypass restrictions — a sign that enforcement is being taken seriously.

What happens next and what parents can do now

The rules mentioned above mark the start of an ongoing process. Ofcom will continue to update its Codes of Practice as it monitors how platforms are complying with the Act, responds to new technologies and gathers evidence from organisations like ours about what works to keep children safe.

That is important, because while there have been early signs of progress, children are still experiencing harm online. This suggests that platforms may not yet be fully meeting their duties, or that Ofcom’s rules may need to go further.

At Internet Matters, we will continue monitoring how the Act is being implemented and pushing for stronger, evidence-based protections to support children and empower parents.

We know that parents are children’s main source of online safety information, and that they need clear, practical tools to help their families navigate the digital world confidently and safely.


About the author

Mitali Sud

Senior Policy Manager at Internet Matters

Mitali leads the development of policy solutions at Internet Matters, advocating for a safer online environment for children.

Get personalised advice and ongoing support

The first step to ensuring your child’s online safety is getting the right guidance. We’ve made it easy with ‘My Family’s Digital Toolkit.’
