Nudify apps: When AI blurs the line between real and fake
28 April 2026
Artificial intelligence has made it possible to create highly realistic images in seconds, including fake nude or sexualised images generated from ordinary photos.
These tools are no longer niche. They’re being shared through social media, group chats and links between friends. For many young people, the technology itself isn’t the focus; it’s how it’s used socially.
So-called “nudify” apps are the tools behind these images. While the technology itself is new, the social dynamics around it are familiar: curiosity, humour, peer pressure and, sometimes, harm.
For some young people, it starts as something to try or joke about. A link gets shared, or someone uploads a photo “just to see what happens”. It can spread quickly for a few reasons: it might feel new and shocking, there might be peer pressure at play, or there might be little awareness that there’s a real person behind the image.
These images can look convincing, even when they’re fake. That can make it harder for young people to understand what’s real, what’s acceptable, and where the boundaries are.
Where it becomes harmful
Even though the images are fake, the impact is real. Situations can escalate when:
- someone’s image is used without consent
- images are shared beyond the original group
- the image is used to embarrass, pressure, or threaten
This is unfamiliar territory for many adults. It can feel confronting, technical, or difficult to even name.
Rather than jumping straight to rules, it can help to open a conversation about how this kind of technology is being used. You don’t need to be an expert in tech; just open up a space for curiosity.
You might ask:
- What are they seeing or hearing about AI tools?
- How do they think people are using them?
- What would feel okay (or not okay) if it involved them or someone they know?
These conversations often lead naturally into discussions about consent, respect, and the impact of sharing images, whether they’re real or not.
The technology may be new, but the underlying issues aren’t. Helping young people navigate respect, consent, and boundaries remains the most important foundation for being safe and responsible online. For young people, knowing they can talk to a trusted adult without getting into trouble is key. For parents, staying curious and informed about how these tools are showing up in their child’s world is often the most powerful first step.
Check out the 'Talking about Deepfakes and Synthetic Media' guide for some suggestions on how to approach the topic calmly and confidently.
