About our nude deepfakes research
This report focuses on the issue of nude deepfakes: sexual imagery generated with AI tools. In almost all cases, these images are non-consensual, and it’s estimated that 98% of all deepfakes in circulation are sexual. Furthermore, 99% of these sexual deepfakes feature women and girls.
Nude deepfakes can impact children in a number of ways:
- Child-on-child sexual abuse and harassment;
- Adult-perpetrated child sexual abuse material (CSAM); and
- Sextortion.
In the report below, you’ll find a summary of current developments in generative AI (GenAI) that have given rise to deepfakes and ‘nudifying’ tools. The report also provides insights into families’ views and experiences of deepfakes, including nude deepfakes.