Summary
- A survey of children and parents found 71% of children had experienced harm online, yet only 36% of those who had been harmed reported it to the platform
- Children cited confusing tools, unclear language, and lack of trust in platforms as key barriers to reporting
- The gap between how often harm occurs and how often it is reported means the true scale of online harm is likely being underestimated, reducing pressure on platforms to respond
- While Ofcom’s Protection of Children Codes are expected to improve the reporting and complaints process, parents and children support further action, including an independent complaints body to handle complaints that platforms fail to resolve
Today, we’ve published our latest research exploring children’s experiences of online harm and the reporting processes intended to protect them. The findings show that while a significant number of children experience harm, far fewer report it.
The survey of 1,000 UK children aged 9-17 and 2,000 parents found that 7 in 10 children reported experiencing online harm, such as contact from strangers, hate speech and misinformation. Yet just over a third of those affected (36%) took action by reporting it. For certain types of harm, reporting is even lower: just 18% of those who saw content promoting dangerous stunts or challenges, and only 23% of those exposed to misinformation, reported it.
Barriers to reporting included complexity and a lack of trust. Only 54% of children agreed that the reporting process is clear and in language they can understand. Children cited too many steps (35%) and confusing categories (31%) as barriers to reporting, as well as concerns about anonymity, particularly when reporting someone they knew offline.
Of those who had reported content or users to a platform, 83% said they found the process easy and 66% were happy with the outcome. Problems persist, however: the majority of children (60%) still encountered difficulties when reporting to a platform. 28% said they were never kept updated on the outcome, and 11% did not receive any support or resources throughout the process.
Most children know how to report, but many feel it is not worth doing: 4 in 10 children agreed with the statement ‘platforms don’t respond to reports or take too long to respond’. Low reporting rates can mask the true scale of online harm, reducing the pressure on platforms to act. Without better visibility, platforms and regulators risk missing these issues, and children will continue to face harm without support or redress.
Parents are also frustrated. Half (50%) want the ability to escalate complaints when they disagree with the outcome, and 82% of parents think they should be able to report online issues to an independent body rather than the platform itself – similar to Australia’s eSafety Commissioner, which can direct an online or electronic service or platform to remove harmful content within 24 hours.
While specific measures in the recently published Protection of Children and Illegal Harms Codes address some of the needs and concerns raised by parents and children in the research, more can be done, including increasing transparency and creating standardised reporting categories.
Katie Freeman-Tayler, Head of Policy and Research at Internet Matters said: “Too many children are suffering in silence and not reporting issues when they arise. It’s not enough for reporting tools to exist; they must work for children. Platforms must do more to make the process easier and more transparent, and parents need independent routes to escalate concerns when systems fail. The Online Safety Act is a step forward, but there’s still a long road ahead to make the internet truly safe for children.”
With the Codes due to come into force this summer, parents can explore our website to find out more about blocking and reporting content on platforms, as well as where to go for specialised support beyond the platforms themselves. Parents can also find free practical resources and guides to support all aspects of their children’s online safety.