
Parents and children say: Ban nudifying apps


In the past year, we have seen a rapid increase in nude deepfakes, with nudifying tools easily accessible online.

This is creating anxiety and fear amongst young people, who support a ban on the tools used to create these images.

What are nude deepfakes?

With the rise of AI technologies, new threats to children have emerged online, including nude deepfakes. Nude deepfakes are sexual imagery created with AI nudifying tools from images of real people, including children.

Last year, online deepfake sexual content increased by over 400%. Our research also finds that an estimated half a million children (13%) have already experienced a nude deepfake online. This includes coming across one on a website, receiving one from a friend or using a nudifying app themselves.

It is now quick, cheap and easy to generate highly convincing nude images and videos of real people with just a few clicks. Platforms have made efforts to remove nudifying tools when they come across them; however, these tools remain accessible on app stores and common in the results of mainstream search engines.

How nude deepfakes impact children, especially girls

Deepfake nudes can profoundly impact children, including through abuse and harassment. Victims of this abuse can experience PTSD, depression, anxiety and even suicidal thoughts. They may also fear physical violence if a perpetrator shares their personal data alongside the image.

In fact, most teenagers (55%) feel that having a nude deepfake image created and shared of them would be worse than a real image. When asked why, teenagers cited:

  • the loss of bodily autonomy
  • concern that they might not know the image existed, who had made it or why
  • the fear it could cause, including friends, teachers and parents thinking the image was real and seeing them differently, or the image completely misrepresenting them.

While this issue affects both boys and girls, 99% of nude deepfakes made are of women and girls. Moreover, many nudifying tools do not work on images of boys and men. Our research suggests that nude deepfakes are becoming another tool used to commit violence against women and girls.

What needs to change

Creating, possessing or sharing a nude deepfake image of a child is illegal and classed as child sexual abuse material (CSAM). However, the tools that create them are currently not illegal and children are increasingly coming into contact with them.

Banning nudifying tools

Our research found that 84% of teenagers and 80% of parents support banning nudifying tools for everyone in the UK, including adults.

In the House of Commons, MP Jess Asato urged the Government to listen to the voices of parents and children on this issue by banning nudifying apps.

Ms Asato said: “The rise of nude deepfakes is an increasingly concerning issue – particularly in schools. Research by Internet Matters has shown the harm it is causing children already – and the potential the technology has for even greater damage. By allowing nudifying programmes to remain readily accessible, we are putting our children at risk to be harmed, and even to be criminalised. That is why I have called on the Government to ban nudifying tools and apps.”

Banning nudifying tools was one of the key recommendations in our report, The new face of digital abuse: Children’s experiences of nude deepfakes.

The onus to protect children from deepfake sexual abuse cannot fall on schools and parents. Industry and government must step in. Banning nudifying tools would help protect children from harm online. Furthermore, it would support the Government’s ambitious goal to halve violence against women and girls in the next decade.

Resources to support families

In the meantime, we want to help families stay better protected online. Explore the resources below, created to support them in navigating this worrying new trend.
