Nudification and deepfake porn to be made illegal in England and Wales – why has it taken so long?

Imagine seeing a naked image of yourself online – it was never taken, it never happened, but it looks scarily real. The act of 'nudification' – taking an image and using artificial intelligence software to strip the subject nude – is the latest horrifying internet trend, and it is gaining a worrying amount of traction.

It's another string to the decidedly creepy bow of 'deepfaking' – the act of artificially superimposing one person's face onto another's body – which is often used to create non-consensual pornography.

In recent years, with the launch of certain apps, nudification technology has become far more accessible and sophisticated. Tackling it is made harder by the difficulty of tracking down whose images are being used, when, and by whom – but we know it's happening.

One website in particular received 38 million hits in the first seven months of 2021 alone – GLAMOUR won't draw attention to it by sharing its name. Disturbingly, its bio reads: "Undress any girls with any dress. Superior than Deepnude. The threat to girls."

It absolutely is a threat to girls. Nudification is the latest form of image-based abuse – and usually, it's women who haven't consented who are the targets, with devastating effects.

Alana* was a victim of fake explicit images that were widely circulated online without her consent. "It has the power to ruin your life and is an absolute nightmare, and it is such a level of violation that is exemplified because you are violated not only by the perpetrator, but you also feel violated because society doesn't recognise your harm," she said. "It is a very isolating, degrading and demeaning experience – it's life-ruining."

Shockingly, these practices fall within a legal grey area in the UK, but that could be about to change. The government has confirmed that a proposed amendment to the Online Safety Bill would criminalise the sharing of manipulated pornographic images without consent. 

In previous cases (via BBC News), men who admitted to sharing manipulated images of women without their consent escaped prosecution because they said they did not intend to cause harm. The new proposals would remove the requirement to prove intent to cause harm, making it easier to prosecute perpetrators.

Justice Secretary Dominic Raab announced the measures, saying they would “give women and girls the confidence that the justice system is on their side and will really come down like a ton of bricks on those who abuse or intimidate them.”

Culture Secretary Michelle Donelan added: “With these latest additions to the bill, our laws will go even further to shield women and children, who are disproportionately affected, from this horrendous abuse once and for all.”

What are the current laws regarding image-based abuse?

The conversation around online sexual abuse has been a long one, and the first move towards making it a punishable offence came with the introduction of 'image-based sexual abuse' laws in 2015, which made it illegal to disclose "private sexual photographs and films with intent to cause distress."

Often referred to as revenge porn, this law only covers real images or videos that the victim originally consented to – leaving a gap in the law when it comes to deepfakes and nudification. Worse, the requirement of "intent to cause distress" means perpetrators can simply claim they "didn't mean to hurt anyone" as a defence.

In 2021, Conservative MP for Basingstoke Maria Miller called on the government to criminalise the sharing of deepfakes without consent.

She told GLAMOUR: "For me, it's a very clear marker of the way in which women are having to deal with very difficult forms of abuse now, abuse which is constructed to be below the radar and above the law. 

"The issue of 'deep fake' and 'nudification' software is just part of that, but it's the part that I'm trying to focus on to demonstrate the broader need for new laws to stop the images of women being used to humiliate them, degrade them and to frighten them."

And while it's important we change the laws regarding explicit fake images, the conversations around these practices must also change, according to the MP for Basingstoke.


"I think most people would find it unacceptable, shocking and disgraceful to think that an image would be taken and 'nudified' without the consent of the individual concerned, but I think the visceral impact of that – it does differ by gender, I think. I think the impact on women is far more acute, because of the way we are perhaps treated in society, this has a very profound impact, in a way which I'm not sure is completely understood. 

"So it's only by having these conversations, with men and women in the room, in parliament, that we can really get everybody to understand how devastating this sort of action can be. Because if at the moment, if you looked at the law – you'd be hard-pressed to think that society has a view on this at all. Which really isn't representing fact. Most people, whether they're men or women, would think this was a heinous crime. But the law doesn't reflect that."

What to do if you think you are a victim of nudification or deepfakes

In response to the rising problem of intimate image abuse, the UK's Revenge Porn Helpline launched a new platform in December to help victims regain control of their images.

The helpline's manager, Sophie Mortimer, tells GLAMOUR that nudified images so often go under the radar. “Unfortunately, we are all too aware of nudification apps, though we haven’t really had cases on the Helpline – this may be because victims are unlikely to be aware that such images have been created," she says. "I find it deeply disturbing that this technology is being used to degrade and dehumanise women, reducing them simply to body parts with no rights or agency.

“If someone comes to us who is aware of these images, then we will do our best to support them: most industry platforms, both social media and adult, do not allow deep faked content so we may be able to assist a client on that basis. And if there is some pattern of behaviour that might contribute to harassment we would encourage them to report what is happening to the police. 

"While deepfakes are actually exempt from the current legislation, they may form part of a course of conduct that could be harassment or perhaps malicious communications, depending on the context.”

Reports to the helpline rose by 87% in 2020, and 2021 is already running 25% higher again. But the charity's new tool, StopNCII.org, allows users to create unique identifiers (hashes) of their images, which are then sent to partners such as Facebook and Instagram. If an image matching a hash meets the criteria of an intimate image, it will be taken down and blocked from any further sharing on all partner platforms.

*Names have been changed for privacy reasons.
