UK becomes the first country where creating deepfakes without consent is illegal

A major step forward in the fight against deepfake abuse has arrived, and for many survivors, it marks a long-awaited moment of progress.

A new law has now come into force that makes it a criminal offence to create non-consensual intimate images using artificial intelligence. For campaigners and victims who have spent years pushing for change, this is more than a legal update. It is recognition that the harm caused by deepfake abuse is real, serious, and deserving of protection.

Earlier this week, campaign group Stop Image Based Abuse delivered a petition with more than 73,000 signatures to Downing Street. Their message was clear: while the new law is a vital milestone, there is still work to do to ensure victims can quickly remove abusive content and access meaningful support. They are calling for civil routes to justice, such as takedown orders that would force platforms and devices to remove harmful imagery faster, along with better education and funding for specialist services like the Revenge Porn Helpline.

For Jodie, a survivor who uses a pseudonym, the day the law took effect felt momentous.

She discovered in 2021 that her images had been turned into deepfake pornography and uploaded online. Alongside 15 other women, she helped bring the perpetrator to justice. He was convicted and sentenced to prison. At the time, however, there was no specific law that fully captured what had been done to her.

“I had a really difficult route to getting justice because there simply wasn’t a law that reflected the harm I experienced,” she said.

The new offence was added as an amendment to the Data Use and Access Act 2025 and has now officially come into force. For many survivors, it represents validation after years of campaigning to have deepfake abuse taken seriously.

The momentum for change has also opened wider conversations about education, prevention, and support. Campaigners want improved relationships and sex education to help young people understand consent and digital responsibility. They also want long-term funding for services that help victims recover and rebuild.

Madelaine Thomas, founder of tech forensics company Image Angel, says the day is emotional for many who have lived through image abuse. She has spoken openly about her own experience of having intimate images shared without consent for years and the toll it took on her mental health.

She believes the new law is a crucial foundation, while also highlighting the need for broader protections so that everyone affected by image abuse, including sex workers, can access help that reflects the seriousness of the harm they face.

What unites survivors, campaigners, and advocates is a sense that the tide is turning. Deepfake technology may be advancing rapidly, but so is awareness, accountability, and the will to protect people from misuse.

For those who have fought for this change, the law coming into effect is not the end of the journey. It is a powerful beginning.

