Internet Matters
Me, Myself and AI research

Understanding and safeguarding children’s use of AI chatbots

Explore children’s interactions with AI chatbots as both tools and companions, along with the benefits and risks.

Key findings

  • Learning and schoolwork: Nearly half of children using AI chatbots say they use the tools for schoolwork. This includes help with revision, writing support and learning new concepts.
  • Advice-seeking: Almost a quarter of children who use AI chatbots say they’ve used them to seek advice. Advice ranges from asking what they should wear to practising conversations they want to have with friends.
  • Companionship: The research also found children using AI chatbots for connection and comfort, including to simulate friendship. Some say they do this out of boredom, to discuss topics they’re interested in or because they have no one else to talk to.
  • Inaccurate and insufficient responses: AI chatbots sometimes fail to give children clear and comprehensive advice. This is concerning because more than half of children who use chatbots say using the tool is better than searching for information themselves.
  • High trust in advice: Two in five children who use AI chatbots have no concerns about following the advice they receive; among vulnerable children, this rises to 50%. This holds even when the advice is contradictory or unsupportive.
  • Exposure to harmful content: Despite AI chatbot providers knowing users’ age and prohibiting age-inappropriate content, children are still encountering harmful responses.
  • Blurred boundaries: Some children see AI chatbots as human-like and refer to them with gendered pronouns. Experts suggest children may become more emotionally reliant on AI chatbots with increased use.
  • Conversations with parents: While a majority of children say a parent has spoken to them about AI in general, many parents haven’t shared their concerns. Parents’ top concerns include children becoming over-reliant, accuracy of information and too much time spent on chatbots.
  • Conversations with teachers: Education about AI in schools is inconsistent and at times contradictory, varying from teacher to teacher. Most children who have spoken to teachers about AI do not recall having multiple conversations.
  • Support for AI education: Children are supportive of schools educating them on using AI chatbots. They feel it could support schoolwork while addressing risks like inaccuracy, over-reliance and privacy.

Recommendations

  • Industry: Platforms need to adopt a safety-by-design approach to create age-appropriate AI chatbots that support children’s needs. This should include built-in parental controls, trusted signposts and media literacy features.
  • Government: The Online Safety Act needs clear guidance on how AI chatbots are covered, keeping pace with rapidly evolving AI technologies, and AI chatbots not built for children need effective age assurance.
  • Supporting schools: Government also needs to embed AI and media literacy at all key stages. This includes effective teacher training and clear guidance on appropriate AI use.
  • Supporting parents and carers: Parents/carers need support on guiding their child’s use of AI. They should feel confident to talk about what AI chatbots are, how they work and when to use them.
  • Policymakers: Children’s voices need to be at the centre of decisions around development, regulation and governance of AI chatbots and AI in general. This includes investing in long-term research on the impacts on childhood.

Me, myself and AI: The full report

Supporting research and resources

Explore more of our research along with resources designed for parents, carers and professionals.
