Date Published
February 06, 2026

Today (6th February 2026), following months of campaigning by survivor Jodie*, the End Violence Against Women Coalition, #NotYourPorn, Professor Clare McGlynn and Glamour UK, a new law comes into force making it a criminal offence to create a sexually explicit deepfake of someone without their consent, or to solicit the creation of one. This is a welcome and long-overdue step for survivors.

The new offence was announced in January 2025 and finally comes into force under the Data (Use and Access) Act 2025 after the Grok scandal, which saw 3 million sexualised images produced. While it should not have taken this long to enact the new law, and many women and girls have been harmed in the delay, criminalising the creation of deepfakes is an important step towards tackling this abuse.

But it is not ‘job done’: without enforcement of the law, adequate regulation of tech companies and civil routes to justice, women and girls will remain exposed to abuse and survivors left to fend for themselves.

Image-based abuse is a major human rights issue

As access to AI and nudification apps increases, so does the proliferation of image-based abuse, such as non-consensual deepfakes. 96% of deepfakes are sexually explicit, and 99% of those depict women.

Around 3 million women and children were victimised by Grok alone, showing the scale and speed at which this abuse is spreading.

Deepfake abuse causes profound harm to victims, including significant psychological trauma, impact on employment opportunities and interpersonal relationships, silencing of women’s voices online and withdrawal from public life. The threat of this abuse is affecting women’s freedoms, including our freedom of speech.

Jodie*, a survivor of deepfake abuse who launched the campaign, said:

“This moment belongs to survivors and campaigners in the UK and beyond who have spent years pushing for the most basic right to have control over our own images. Women and girls should never have had to fight this hard to be protected from a form of abuse that is so clearly devastating and intrinsically gendered.

While I am relieved that the law is finally being brought into force, it is impossible to ignore that it should have happened last year, when it received Royal Assent. That it has taken the Grok scandal, and the creation of yet more victims, to force action is deeply disappointing. This delay has caused real, avoidable harm.

This campaign has always been about more than one platform or one moment in the news cycle. It is about recognising image-based abuse as a serious form of sexual violence, keeping pace with rapidly evolving technology, and ensuring survivors have meaningful routes to justice and support, and that those routes are properly funded. Today [Friday] marks how far we’ve come, but also how much further we still have to go. The work does not stop here.”

73,000 members of the public demand more action on deepfake abuse

Today, Jodie and her fellow campaigners delivered a 73,000-strong petition to No 10 Downing Street calling for the government to go further than criminal offences alone and focus on regulating Big Tech, which is promoting and profiting from the abuse of women and girls.

This abuse is happening at an industrial scale and until tech platforms are forced to prioritise women’s safety over profit, the problem will continue, no matter how many criminal offences exist.

What needs to happen next

Our campaign is calling for the following measures to stop image-based abuse:

  1. Rapid image takedown routes
  • Survivors need 48-hour takedown orders so abusive images can be removed quickly from websites, platforms, and perpetrators’ devices
  • Many women do not want to pursue a criminal case but desperately want the images gone, so they need removal routes that do not depend on a criminal prosecution
  • Police should also be granted the power to issue 48-hour takedown notices, as with online knife advertising
  • Blocking of non-compliant websites that are domiciled outside the UK and not subject to the same regulations, as is already done for CSAM and terrorist content
  2. Civil routes to justice
  • Stronger civil law options to act against perpetrators and platforms
  • Redress that does not rely solely on the criminal justice system, which routinely fails survivors of all forms of violence against women and girls
  • Legal measures that enable survivors to obtain takedown orders and other civil remedies are already in place in countries such as Canada and the US
  3. A wider, context-based definition of intimate images
  • What counts as ‘intimate’ isn’t universal. For some women, particularly those from religious or minoritised communities, an image showing them without a hijab can be deeply violating and dangerous. If the law only reflects a white, Western idea of intimacy, it will leave many women unprotected.
  • The definition must recognise cultural, religious and social meaning, not just physical exposure: a framework that centres harm and coercion, rather than relying on a narrow checklist of body parts.
  4. Tech accountability
  • Tech companies are profiting from the abuse of women and girls. There must be strong penalties for companies that fail to prevent and respond to abuse.
  • We need a mandatory Violence Against Women and Girls Code of Practice for platforms to follow, with consequences for non-compliance. Current VAWG guidance for tech platforms is voluntary, meaning compliance is patchy and down to goodwill.
  • We need a real ban on nudification tools, not loopholes that ignore how these technologies actually work. This must be based on regulation, not just criminalisation.
  5. An Online Safety Commission for online harms that:
  • Provides a clear reporting route for individual survivors, supports them through image takedown, carries out evidence-gathering and escalation, compiles evidence of emerging and systemic harms including new uses of technology, and feeds that evidence directly into investigations, enforcement and regulatory action.
  • This would bridge the gap between individual harm and systemic accountability, ensuring women don’t have to sacrifice their privacy, safety or dignity just to be taken seriously.
  • At the moment, there is no clear place for a woman to go to report deepfake or intimate image abuse and trigger action. Ofcom regulates platforms at a systems level, but it does not deal with individual complaints. That means survivors are left reliant on underfunded specialist charities, or hoping their case becomes part of a critical mass that attracts media attention and forces a government response. That is not a functioning system of accountability.
  6. Education in schools
  • Relationships and Sex Education must reflect the digital reality young people are growing up in. We welcome the government’s recent commitment to this in its VAWG strategy, and keenly await details of what it will look like in practice.
  • There must be clear information that creating or sharing deepfake sexual images is abuse and carries consequences.
  7. Enforcement of the offence
  • We now need to see a focus on enforcement of the new offence, so that the police respond appropriately.
  • We also need the government to go much further than criminal justice alone: proper regulation and support, and more options for redress. That is harder than simply adopting a criminal offence, but it is what is needed. 3 million women were victimised by Grok – this is a massive issue that we need to take seriously.
Clare McGlynn, Professor of Law at the University of Durham, said:

“Every day women face the threat of being deepfaked, a devastating violation that can shatter lives. This new law is a vital step towards ending image-based abuse. But it’s only the start.

We need comprehensive reforms that give women rights to get material taken down and deleted. We need a regulator that will take proactive and robust action against deepfake and nudification websites and apps.

And we need these actions urgently. Legal changes do not need to take years. We just need to prioritise acting against this industrial scale abuse of women and girls.

This new law sends a clear message that deepfake sexual abuse is wrong and criminal. This provides the foundation for action; we now need to build on it by granting civil rights to women and girls to get imagery removed and deleted. We also need strong action to ensure that platforms are held to account.”

Rebecca Hitchen, Head of Policy & Campaigns at the End Violence Against Women Coalition (EVAW), said:

“It is thanks to the determination of victims and survivors like Jodie that this law comes into force today. Far too many women and girls have paid the price for a global failure to properly regulate online platforms which will always choose profit over the safety and wellbeing of users. 

This is a momentous step in the right direction but the government and Ofcom must go further. This means more support for survivors, more routes for take downs and removals, and robust and effective regulation.”

Elena Michael, co-founder and Director of #NotYourPorn, said:

“In the noise about this law being a Government success story, we must champion the stories of survivors behind the scenes, like Jodie, who didn’t give up and fought every day to be heard. This law has been crafted from Jodie’s experience and we are indebted to her for fighting for the rights of women and girls despite the enormous personal cost. 

“Although we are pleased the law is coming into force, it shouldn’t have taken a year, or public outrage, to bring it to fruition. This is a sobering reminder of how survivors lead change, and that listening to survivors early on is not optional: it is essential to preventing harm. This is not the time to become complacent with the wins. There is much left to do, and survivors like Jodie are at the heart of the solutions.”

Lucy Morgan, Purpose and Digital Director at Glamour UK, said:

“We are heartened and honoured to be part of this survivor-led movement to eradicate image-based abuse. 

Today is a testament to all the survivors and victims of image-based abuse who have spent the last few years tirelessly campaigning and sharing their stories to make this change. We stand with survivors and reiterate that Glamour UK will always be a safe space for their stories.”

What the new law does
  • Criminalises the creation and solicitation (i.e. asking someone to create) of non-consensual sexually explicit deepfake images
  • Complements existing offences around sharing intimate images
  • Sends an important signal that this behaviour is abuse, not entertainment
What the law does not do
  • It does not automatically ensure abusive images are removed from the internet
  • It does not hold tech companies meaningfully accountable for hosting or enabling this abuse
  • It does not provide easy, survivor-led routes to redress outside the criminal justice system
  • It would not necessarily have covered the basic “bikini” request that went viral on Grok. The definition of “intimate image” mirrors existing law: it clearly covers sexualised imagery and underwear. A straightforward bikini image may not be covered, but highly sexualised or explicit depictions (for example transparent clothing or sexual acts) are likely to fall within scope. This highlights why definitions may need to be expanded over time.
  • Right now, if a woman is targeted by deepfake abuse, her options can feel very limited. There are incredible charities like the Revenge Porn Helpline, but specialist support services are overstretched and Ofcom doesn’t handle individual cases. Unless your abuse becomes a media story, you’re on your own, and that’s not acceptable.

 
