Learn about it
Find out about the type of inappropriate content that your child may see across the platforms and apps they use and what the challenges are to keeping them safe.
What’s on the page
As children become more active online at a younger age, the likelihood that they’ll see something inappropriate depends largely on what they’re doing online.
Whether it’s an explicit pop-up ad on a free game, videos showcasing children’s cartoon characters in adult situations, or a forum promoting self-harm, an innocent search can expose children to content that can make them feel upset and confused.
As the internet holds a lot of content, some of which is adult in nature, it's possible that children may stumble across things that are not suitable for their age or stage of development.
Stats show that 63% of teens believe that accidentally accessing inappropriate content online is an issue.
What you think is inappropriate for your child may also differ from your child’s view, and will depend on their age and maturity level.
In summary, inappropriate content consists of information or images that upset your child, material that’s directed at adults, inaccurate information or information that might lead your child into unlawful or dangerous behavior.
Accessing inappropriate content is possible on any internet-enabled device. Your child may stumble upon unsuitable material on websites, in apps, through links sent by friends, or while chatting to others online.
What online activities can increase the likelihood that my child will see inappropriate content?
- Joining social networks before reaching the minimum age
- Playing games and using apps which are not age-appropriate
- Watching live streams which may show inappropriate content, or taking part in them and unknowingly being exploited
56% of 11-16 year olds have seen explicit material online
One-third of British children aged 12-15 have encountered sexist, racist or discriminatory content online
One in ten children aged 8-11 who go online said they had seen something nasty or worrying online
From research, we know that as children become more active online it’s highly likely that they’ll see something they may not be able to process, and in many cases they may not tell anyone about what they’ve seen. According to research from LGfL – Hopes and Streams, one in five children said that they had never told anyone about the worst thing that had happened to them.
What you think is inappropriate material for your child will probably differ from your child’s view or that of other parents. It will also depend on your child’s age and maturity level.
Inappropriate content includes information or images that upset your child, material that’s directed at adults, inaccurate information or information that might lead or tempt your child into unlawful or dangerous behaviour. This could be:
- Pornographic material
- Content containing swearing
- Sites that encourage vandalism, crime, terrorism, racism, eating disorders, even suicide
- Pictures, videos or games which show images of violence or cruelty to other people or animals
- Gambling sites
- Unmoderated chat rooms – where there’s no one supervising the conversation and barring unsuitable comments.
- Sexism or sites that portray females in very traditional roles that do not reflect contemporary values and expectations
There are a number of ways that you can work out whether a piece of content is suitable for your child. Many platforms use a type of rating to advise on the level of violence and explicit content that a piece of media contains.
In the UK, the British Board of Film Classification (BBFC) helps to regulate and classify the content of films shown in cinemas. It is now also working, under new legislation, to restrict access to online pornography by requiring commercial pornography websites to introduce age verification to keep out under-18s.
While this work is ongoing, here are some things you can watch out for to help you make an informed choice about whether an app, website or piece of content is suitable for your child.
The BBFC ran a pilot to rate online music videos in the same way that films are rated, to help parents and children make an informed decision about whether the content they are watching is age-appropriate. The ratings appear on the two biggest video-sharing platforms, VEVO and YouTube.
- On YouTube, you’ll see the ‘Partner Rating’ label below the video, showing whether it is rated PG, 12, 15, or 18.
- On VEVO, you’ll see the rating symbol in the top left-hand corner of the video player when the video loads. You can also click on the ‘i’ for more information about the rating.
Although these ratings are helpful for determining whether an app is appropriate for your child, in some cases the age rating placed on an app may not truly reflect the risk that your child will be exposed to adult content. In addition to being guided by age ratings, it’s always a good idea to explore the app together with your child and read up on it to gain more insight.
Pan European Game Information (PEGI) ratings are used to advise which video games are suitable only for older or younger teens, or for adults only, due to the type of content they contain.
PEGI ratings were introduced in 2003 and range from PEGI 3 to PEGI 18: PEGI 3, PEGI 7, PEGI 12, PEGI 16 and PEGI 18. The number relates to the age the game is appropriate for, so a game with a PEGI 7 rating is suitable for children aged 7 and over. These ratings are legally enforceable, meaning it is illegal to sell a PEGI 18 game to a child. Note that the ratings relate only to the appropriateness of the content, not to the game’s level of difficulty.
Most social media platforms’ terms and conditions state that children should be 13 or over to use them. This minimum age is not because the content on the platform is only suitable for those aged 13 and over, but because of COPPA (the Children’s Online Privacy Protection Act), a US law passed to protect the privacy of under-13s.
More recently, the General Data Protection Regulation (GDPR) was introduced to ensure that organisations collecting data from children under 13 get parental consent before those children start using their services. Since the change, a number of social media platforms have amended their terms and conditions to comply; WhatsApp actually sought to change its minimum age to 16 rather than 13.
Therefore, it is still important to explore any social media platform that your child is planning to use, so you can get familiar with what they might see and with the safety and privacy features it offers to protect them, and make an informed choice about whether they are ready to use it.
It can be difficult to monitor what your child is viewing as they can access this material through any internet enabled device, including mobile ones such as a phone or tablet.
Sometimes your child may stumble upon unsuitable sites by accident: through apps they’ve downloaded to their mobile device, through links sent by friends, while chatting to others online, or even through inter-device communication systems such as Bluetooth or Apple’s AirDrop.
Although there are a number of tools you can use to closely monitor what your child is doing on their device and to block access to certain content through filters, preparing them for what they might see is vital, so they know how to deal with anything they shouldn’t have seen.