
AI nudification, dishonour abuse, and why children must be protected now

How AI nudification is turning everyday photos into a new form of sexual abuse against children

Written by Aneeta Prem MBE

Published on January 6, 2026

A New Kind of Abuse, Hiding in Plain Sight


AI-powered “nudification” tools are creating a new form of sexual abuse that many adults still do not fully understand but that children are already living with.

These tools use artificial intelligence to turn an ordinary photo into a fake nude image. What began on the dark web and fringe forums is now packaged as mainstream apps and websites, often described as “fun”, “pranks”, or even “art”.

For children, there is no meaningful difference between a real nude image and a fake one. Both can destroy reputations, damage mental health, and change the course of a young life.

This is not a future risk. It is happening now, at scale, and largely in plain sight.

What is AI nudification?

AI nudification is the use of computer software to create a fake nude image of a real person from an ordinary photo.

The person in the photo has not agreed to this.
The image is not real, but it can look real enough to cause serious harm.

For children, the impact is the same as if the image were real. Fake nude images are used to shame, threaten, bully, and blackmail.

Under UK law, sexual images of children – whether real or AI-generated – are treated as child sexual abuse material.

A new kind of abuse the law was not built for

For decades, child protection focused on preventing the creation and sharing of real indecent images of children. AI nudification changes the rules entirely.

A single innocent photo – a school trip, a sports day, a selfie – can be copied, uploaded, and transformed into sexualised material that looks real enough to be believed. A child does not need to take or send an intimate image for abuse to occur. Their body can be fabricated from pixels that never existed.

Child protection law was never designed for a world where a child’s body can be invented, weaponised, and endlessly reproduced by software. That gap is only now being addressed.

Girls as targets, boys as collateral

The pattern is clear and deeply concerning.

AI sexual imagery overwhelmingly targets girls and women. At the same time, teenage boys are more likely to be the ones creating, sharing, or forwarding these images. In some UK classrooms, pupils have already encountered nudified or deepfake images.

This is not harmless experimentation. It is the digital continuation of school-yard misogyny, amplified by code.

The harm to boys is real too. Boys are drawn into an online culture where cruelty earns attention, empathy is mocked, and respect for girls is treated as weakness. When boys are told that “everyone does it”, they are being recruited into an economy that normalises sexual humiliation as entertainment.

Later, those same boys may face serious consequences: criminal records, permanent digital footprints, and the lifelong burden of knowing they caused harm before they fully understood it.

“Not in my name” must be more than a slogan. It must be a refusal by boys and young men to be used as cover for an industry that profits from the abuse of girls while pretending it is simply offering tools.

The emotional toll: shame, fear, and silence

For children whose images are nudified, the impact is often invisible but profound.

Many children say that simply knowing this technology exists is as frightening as an actual image being shared. Once they understand how easily it can be done, every phone in the room feels like a threat.

Girls talk about avoiding school, stopping activities, or changing how they dress, simply to reduce the risk of being photographed.

Research from the Children’s Commissioner for England shows that many teenagers believe it would be worse to have a fake nude image created of them than a real one. A real image can sometimes be explained as a mistake or a betrayal of trust. An AI image sends a different message:

“Your body is not yours. Anyone can do this to you.”

Shame keeps many children silent. They fear adults will believe the image is real or blame them for taking the original photo, however innocent it was. For some, withdrawing from school, friendships, and online spaces feels safer than being visible.

Dishonour abuse has gone digital

I use the term dishonour abuse deliberately.

Dishonour abuse describes harm where shame is used as a weapon to control, silence, or punish women and girls. There is nothing honourable about this abuse. The dishonour belongs entirely to the abuser.

AI nudification follows the same pattern.

A girl is harmed not because of what she has done, but because someone wants to humiliate her, threaten her reputation, or frighten her into silence. The image is the tool. Shame is the weapon.

This is why nudification apps are not just a technology problem. They are a modern form of violence against women and girls and a clear example of dishonour abuse online.

You can read more about dishonour abuse and safeguarding here:
👉 https://www.freedomcharity.org.uk

Sextortion at scale: when abuse becomes automated

Nudification is not limited to playground cruelty. It is now used by organised abusers and extortion networks.

Criminals need only a handful of non-sexual images from a child’s social media or gaming accounts to create convincing sexual imagery. That imagery is then used to blackmail the child into sending real images, paying money, or complying with escalating demands.

The NSPCC has warned that AI is increasingly being used to enable sextortion and coercion, removing the need for an offender to persuade a child to send an initial image.

The speed, scale, and anonymity of AI make this form of abuse cheap, fast, and extremely hard to trace.

What the law says in the UK

The law is catching up, but it has been slow.

  • Sexual images of children, including AI-generated images, are already illegal under UK law.
  • Sharing or threatening to share intimate images without permission is a criminal offence under the Online Safety Act.
  • New UK legislation is being introduced to make it illegal to create and supply AI nudification tools, not just to use them.

This matters because it targets the companies and developers who profit from this abuse, not only the individuals who commit it.

UK Government guidance on online safety can be found here:
👉 https://www.gov.uk/government/collections/online-safety-bill

If this happens to you

If someone makes a fake nude image of you:

  • You have done nothing wrong
  • The image being fake does not make it less serious
  • The shame is not yours to carry

If you are under 18, you can use Report Remove to help get images taken down:
👉 https://www.iwf.org.uk/report-remove

You can also talk to a trusted adult, teacher, or contact Childline:
👉 https://www.childline.org.uk

You deserve protection, support, and to be believed.

“Not in my name”: changing the story

Stopping nudification is not only about banning apps. It is about changing the culture that makes them marketable.

Children need clear, honest conversations about AI-based sexual abuse. Consent does not disappear online. If you would not strip someone’s clothes off in real life, you do not get to do it with code.

This is where boys as allies matter. When boys refuse to use these tools, challenge friends who do, and say “this does not represent me”, the social cost rises faster than any legal penalty.

For girls, the message must be equally clear: if an image is created of you without permission, real or fake, the blame lies entirely with the person who did it.

Children should never have to choose between having a digital life and having a safe one.

Frequently asked questions

Is AI nudification illegal in the UK?
Sexual images of children, including AI-generated images, are already illegal. New laws are being introduced to ban the creation and supply of nudification apps altogether.

Why are girls targeted more than boys?
Most AI sexual images are made of women and girls. This reflects wider patterns of hatred of women and the use of shame to control girls’ behaviour.

What does dishonour abuse mean online?
It means using shame and humiliation to control or silence someone. Online, this often happens through images, threats, and public exposure.

Final word

AI nudification is a test of whether society is willing to treat online sexual abuse of children with the same seriousness as abuse that happens offline.

It is also a test of whether boys and young men will allow their identities to be shaped by an industry that profits from the suffering of girls, or whether they will stand together and say, clearly and confidently:

Not in my name.

