Deepfakes: What They Are and Your Rights
A clear explanation of AI-generated fake images and videos, how they are used to harm people, and what legal protections exist for you in the UK.
Creating or sharing non-consensual intimate deepfakes is illegal in the UK. If this happens to you, you have rights, and you do not have to deal with it alone.
Deepfakes are AI-generated images or videos that realistically depict someone doing or saying something they did not actually do or say. The technology has improved dramatically and is now accessible to almost anyone. For young people, the most serious concern is the use of this technology to create non-consensual intimate images — fake explicit images of real people. This is a growing harm, and it is important to know both what the law says and what you can do if it happens to you.
What deepfakes are and how they are made
Deepfakes use artificial intelligence to manipulate or generate realistic images and video. Some are created with 'face-swapping' tools that place one person's face onto another person's body; others generate entirely new images from a written description. Usually, all that is needed to create a fake image of someone is a collection of ordinary photos of them, which may already be publicly available on social media. You do not need to have shared any intimate images for someone to create a fake one of you.
How deepfakes are used to harm people
Non-consensual intimate deepfakes (sometimes called NCII deepfakes) are used to humiliate, control, or extort people. An abuser may threaten to share a fake explicit image unless the victim complies with their demands — this is a form of sextortion. Fake images may be shared in peer groups to humiliate someone. They may be posted publicly. Even the threat of their creation or sharing can be used as a form of abuse and coercive control.
Your legal rights in the UK
UK law has been updated to address this harm. Sharing a non-consensual intimate image, including an AI-generated fake image, is a criminal offence under the Sexual Offences Act 2003 as amended by the Online Safety Act 2023. The Online Safety Act also strengthened platforms' obligations to remove such content quickly. The law has since been extended so that creating a non-consensual intimate deepfake image of an adult is also an offence, even if it is never shared. These laws are relatively new; if you are unsure of your position, the Revenge Porn Helpline (0345 6000 459) can advise.
What to do if this happens to you
1. Do not send money or comply with demands. This rarely makes the situation better.
2. Screenshot and document any threats or demands as evidence.
3. Report to the platform immediately. Most major platforms have specific processes for removing intimate images, including deepfakes.
4. Contact the Revenge Porn Helpline (0345 6000 459 or revengepornhelpline.org.uk). They can help get images removed and advise on next steps.
5. Report to the police.
6. Tell a trusted adult.
This is not your fault, regardless of what images of you are publicly available online.
If anything in this guide has made you think about your own situation and you need to talk to someone, Childline is free and confidential on 0800 1111.
Last reviewed: 2026-04-01