The code requires companies that design apps, social media platforms, online games and connected toys to make children’s privacy a primary consideration.
The code was introduced as part of the Data Protection Act 2018, and the ICO, whose job is to make sure companies don’t misuse personal data, was charged with drawing it up and implementing it.
Why is it important?
There was a time when we tried to protect young people from the internet by putting up barriers and controls, but that means kids miss out on all the amazing things you can do online. It has become the place to learn, to play and to connect with friends and family. As a parent, I talk with my daughter about online safety and data protection. We don’t want to put barriers up that would stop children from accessing the opportunities the internet can provide. The code protects children within the digital world rather than protecting them from it.
How the code works
The code says that companies designing platforms or apps likely to be accessed by children must follow 15 standards, including:
- design services to be age appropriate and in children’s best interests;
- consider whether the use of children’s data keeps them safe from commercial and sexual exploitation;
- provide a high level of privacy by default;
- stop using design features like nudges that encourage children to provide more data;
- switch off, by default, geolocation services that track where children are and profiling services that track a child’s online behaviour; and
- map what personal data they collect from UK-based children.
The code is already having an impact.
Facebook, Google, Instagram, TikTok and others have all recently made significant changes to their child privacy and safety measures. For example, YouTube has blocked ad targeting and personalisation for all children, and Instagram now prevents adults from messaging children who do not follow them and defaults all child accounts to private.
As the first of its kind, the code is also having an influence globally. Members of the US Senate and Congress have called on major US tech and gaming companies to voluntarily adopt the standards in the ICO’s code for children in America. The Irish Data Protection Commission is also preparing similar regulations.
We have identified that some of the biggest risks come from social media platforms, video and music streaming sites and video gaming platforms. In these sectors, children’s personal data is being used and shared to bombard them with content and personalised service features. This may include inappropriate adverts, unsolicited messages and friend requests, and privacy-eroding nudges urging children to stay online. We’re concerned about several harms that could result from this data use: physical, emotional, psychological and financial.
Children’s rights must be respected, and we expect organisations to prove that children’s best interests are a primary concern. The code gives clarity on how organisations can use children’s data in line with the law, and we want to see organisations commit to protecting children by designing their services in accordance with the code.
So, we’ll be proactive in requiring social media platforms, video and music streaming sites and the gaming industry to tell us how their services are designed in line with the code.
We’ll help organisations that need further support and, if needed, we can use our powers to take action.
How you can help
If a parent or child has an issue with what they have seen online, they should in the first instance report it to that online provider. If they need additional support from the ICO after they have done this, they should contact us to make a complaint.