Fewer than half of parents are confident that their child’s school is well prepared to respond if their children become victims of ‘nudification AI’ apps, a survey has found.
The survey found that just 47% were confident or very confident that their child’s school was ready to respond to AI-generated abuse, which may be perpetrated by other students.
The survey, of 57 parents and carers of students in six London schools, also found that:
- More than a third (37%) of parents said they were not familiar with the threat of nudification AI.
- Around 60% of parents said that use of nudification AI should be a criminal offence, with about 20% believing it should be dealt with solely by schools as a disciplinary offence.
- Almost half (46%) said that if their child was the perpetrator then the child should be dealt with by the courts.
- Around half (47%) of parents felt children should be told about nudification AI apps at key stage 2 (age 7-11).
“The emergence of AI-facilitated abuse, particularly child sexual abuse material generated by and for children, presents an unprecedented challenge to traditional safeguarding models,” the researcher, Danielle Hotz, told the annual conference of the British Sociological Association in Manchester today. [Thursday 9 April]
“AI-powered nudification tools enable the synthetic sexualisation of children’s images, generating urgent safeguarding challenges for parents, educators and policymakers.
“Children and young people are now involved in the production, distribution and possession of AI-child sexual abuse material, often with a limited understanding of the legal, ethical or gendered consequences.
“The survey data exposes a concerning lack of parental confidence in schools, with less than half confident in their school’s ability to respond to AI-generated abuse.
“This concern stems directly from a critical policy and knowledge vacuum – established educational and policy frameworks cannot keep pace with the exponential evolution of generative AI, thus failing to adequately address the risks posed.
“This study confirms that AI-facilitated abuse has created an untenable crisis, with the system ill-equipped to handle this contemporary threat.”
She called for technology companies to be held accountable for their role in the proliferation of AI-facilitated abuse, and a comprehensive safeguarding strategy to prioritise the wellbeing of victims.
- Most of the children whose parents were surveyed were aged 7-11. Ms Hotz carried out the research for an MSc degree in Criminology and Criminal Justice at Durham University. Nudification apps use AI to turn images of real people into fake nude pictures and videos without their permission.
For more information, please contact:
Tony Trueman
British Sociological Association
Tel: 0044 (0)7964 023392
tony.trueman@britsoc.org.uk
Notes:
The British Sociological Association’s Annual Conference takes place from 8 to 10 April 2026 at the University of Manchester, with more than 700 papers presented. The British Sociological Association’s charitable aim is to promote sociology. The BSA is a company limited by guarantee, registered in England and Wales. Company Number: 3890729. Registered Charity Number: 1080235. www.britsoc.co.uk