Creating, possessing or sharing a nude deepfake image of a child is illegal and classed as child sexual abuse material (CSAM). However, the tools used to create these images are not themselves illegal, and children are increasingly coming into contact with them.
Banning nudifying tools
Our research found that 84% of teenagers and 80% of parents support banning nudifying tools for everyone in the UK, including adults.
In the House of Commons, MP Jess Asato urged the Government to listen to the voices of parents and children on this issue by banning nudifying apps.
Ms Asato said: “The rise of nude deepfakes is an increasingly concerning issue – particularly in schools. Research by Internet Matters has shown the harm it is causing children already – and the potential the technology has for even greater damage. By allowing nudifying programmes to remain readily accessible, we are putting our children at risk of being harmed, and even criminalised. That is why I have called on the Government to ban nudifying tools and apps.”
Banning nudifying tools was one of the key recommendations in our report, The new face of digital abuse: Children’s experiences of nude deepfakes.
The onus of protecting children from deepfake sexual abuse cannot fall on schools and parents alone; industry and government must step in. Banning nudifying tools would help protect children from harm online. It would also support the Government’s ambitious goal of halving violence against women and girls in the next decade.