Internet Matters
Understanding & improving how kids report online harm

Although blocking and reporting tools are widely available, research shows that children and young people often do not use them to manage harm they encounter online.

Key findings

We conducted a UK nationally representative survey of 2,000 parents and 1,000 children. We also conducted qualitative research with children aged 15 and 16.

The findings are structured into four sections:

  • Prevalence of reporting: This section explores how often children report different types of harm. It also looks at the varying reporting rates across platforms.
  • Knowledge of blocking and reporting on platforms: In this section, we look at children’s understanding of how to block and report on various platforms. We also expand on what they believe happens following a submitted report.
  • Barriers to reporting: This section explores the reasons behind children not reporting harmful content to platforms.
  • Reporting processes and outcomes: Finally, this section looks at children’s and parents’ satisfaction with the reporting process. We also identify areas for improvement.

Summary of findings

  • 71% of children say they have experienced harm online. However, only 36% say they reported it to the platform.
  • Girls are more likely to report upsetting content, while boys are more likely to report illegal content.
  • Children were more likely to report harm if it affected them, their friends or their family.
  • 54% of vulnerable children reported a harm compared to 33% of non-vulnerable children.
  • Most children know how to report or block users. However, many are unsure what happens after they make a report.
  • Barriers to reporting include unclear language, too many steps and confusing categories. Some children also worry about anonymity and platform inaction.
  • 83% of children found the reporting process easy. However, 60% still encountered at least one challenge.
  • Most (79%) children support more education from schools around reporting.
  • 83% of parents have talked with their child about reporting. Younger users were also in favour of parents being able to report on their behalf.
  • 50% of parents agreed that they should have the ability to escalate a report. Additionally, 43% supported the option to file a complaint with an independent body like Ofcom.

Our recommendations

Building on the requirements under the Online Safety Act, we have further evidence-based recommendations to improve the reporting process for children. Full recommendations are outlined in the briefing below.

Industry

  • Involve children in implementing Codes of Practice.
  • Make sure guidance is age-appropriate.
  • Use media literacy by design to educate children on how to use blocking and reporting tools.
  • Give parents clear and accessible information on how to report and block on their child’s behalf.
  • Platforms should prioritise reports that involve or come from children.

Government and the regulator

  • The government should require platforms to publish reporting data.
  • It should also offer alternative routes for reporting issues if a parent feels that the platform has not responded appropriately to a complaint.
  • Embed media literacy in the school curriculum throughout a child’s time in education.
  • Ofcom should continue to review and adapt Codes as new evidence emerges.

Read the full report brief

Supporting resources
