The new face of digital abuse

Children’s experiences of nude deepfakes

The rise of generative AI tools has made it significantly easier to produce realistic sexual deepfakes, with nude deepfakes making up around 98% of all deepfakes.

99% of nude deepfakes feature women and girls. Explore our report below that examines the rise of deepfakes in classrooms and our recommendations on how to address this.

About our nude deepfakes research

Young people grapple with a great deal of social and emotional change in their teenage years: they are working out who they are, learning to identify and manage their emotions, and developing a sense of how they fit into wider peer groups. Given the importance of technology in modern life, some of these changes play out in online spaces.

So, if young people are to be supported through the ups and downs of adolescence, it is critical to understand how online platforms influence their development.

Key findings

The majority of families have little to no understanding of deepfakes.

Almost two-thirds of children and almost half of all parents say they don't know or understand the term 'deepfake'.

Nudifying tools are used to sexually abuse children.

While most nudify sites prohibit the production of deepfake sexual images featuring children, these guardrails are often easy to circumvent.

Boys and vulnerable children are more likely to have engaged with a nude deepfake.

Teen boys are twice as likely as teen girls to report experience with a nude deepfake. A quarter (25%) of vulnerable children say they have experience with a nude deepfake, compared to 11% of non-vulnerable children.

The volume of deepfakes has grown rapidly online.

Evidence suggests that the majority of deepfakes are used for harmful purposes. Additionally, it is difficult to establish the true scale of deepfakes circulating online.

Teenagers see nude deepfake abuse as worse than sexual abuse featuring real images.

The reasons teens gave included a lack of autonomy over or awareness of the image, the anonymity of the perpetrator, how the image might be manipulated, and fears that people would think the image is real.

Families agree that Government and Industry need to do more to tackle nude deepfakes.

The majority of teens and parents feel that nudifying tools should be banned for everyone in the UK, including adults. Families also agree that more education is needed on the topic.

Nudify tools are widely available online.

These AI models, which strip the clothes from images of real people, including children, have become more common with the development of GenAI.

A significant number of children have experience with a nude deepfake.

13% of teens say they have some sort of experience with nude deepfakes, which equates to around 4 children in a class of 30, or almost half a million UK teens.

Legislation and Industry action are needed to protect children from deepfake sexual abuse.

Parents and schools cannot and should not be expected to protect children alone. We are calling on the Government to ban nudify tools as a priority in this Parliament.

Read the full research into nude deepfakes

Supporting resources