Nudification and deepfake porn to be made illegal in England and Wales

Often referred to as revenge porn, the existing offence only covers images or videos that the victim originally consented to – meaning there’s a gap in the law when it comes to deepfakes and nudification. Worse, the requirement to prove “intent to cause distress” means that perpetrators can simply say they “didn’t mean to hurt anyone” as a defence.

In 2021, Conservative MP for Basingstoke Maria Miller called on the government to criminalise the sharing of deepfakes without consent.

She told GLAMOUR: “For me, it’s a very clear marker of the way in which women are having to deal with very difficult forms of abuse now, abuse which is constructed to be below the radar and above the law.

“The issue of ‘deep fake’ and ‘nudification’ software is just part of that, but it’s the part that I’m trying to focus on to demonstrate the broader need for new laws to stop the images of women being used to humiliate them, degrade them and to frighten them.”

And while it’s important we change the laws regarding explicit fake images, the conversations around these practices must also change, according to Miller.

“I think most people would find it unacceptable, shocking and disgraceful to think that an image would be taken and ‘nudified’ without the consent of the individual concerned, but the visceral impact of that does differ by gender, I think. The impact on women is far more acute because of the way we are perhaps treated in society; this has a very profound impact, in a way which I’m not sure is completely understood.

“So it’s only by having these conversations, with men and women in the room, in parliament, that we can really get everybody to understand how devastating this sort of action can be. Because at the moment, if you looked at the law, you’d be hard-pressed to think that society has a view on this at all. Which really isn’t representative of the facts. Most people, whether they’re men or women, would think this was a heinous crime. But the law doesn’t reflect that.”

What to do if you think you are a victim of nudification or deepfakes

In response to the rising problem of intimate image abuse, the UK’s Revenge Porn Helpline launched a new platform in December to help victims regain control of their images.

The helpline’s manager, Sophie Mortimer, tells GLAMOUR that nudified images so often go under the radar. “Unfortunately, we are all too aware of nudification apps, though we haven’t really had cases on the Helpline – this may be because victims are unlikely to be aware that such images have been created,” she says. “I find it deeply disturbing that this technology is being used to degrade and dehumanise women, reducing them simply to body parts with no rights or agency.

“If someone comes to us who is aware of these images, then we will do our best to support them: most industry platforms, both social media and adult, do not allow deepfaked content, so we may be able to assist a client on that basis. And if there is some pattern of behaviour that might contribute to harassment, we would encourage them to report what is happening to the police.

“While deepfakes are actually exempt from the current legislation, they may form part of a course of conduct that could be harassment or perhaps malicious communications, depending on the context.”

Reports to the helpline rose by 87% in 2020, and 2021 is already 25% higher again. But the charity’s new tool, StopNCII.org, lets users create unique identifiers (hashes) of their intimate images on their own device; only these hashes, never the images themselves, are shared with partner platforms such as Facebook and Instagram. If a matching upload meets the criteria of an intimate image, it is taken down and blocked from any further sharing on all partner platforms.
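To make the matching step concrete, here is a minimal sketch of the hash-and-compare idea that tools like StopNCII.org describe. It is not StopNCII’s actual implementation: the simple “average hash” below, and the function names average_hash, hamming_distance and blocks_upload, are illustrative assumptions, chosen only to show how an image can be fingerprinted on a user’s device and later recognised by a platform without the image itself ever leaving that device.

```python
# A minimal sketch (not StopNCII's real scheme) of hash-based image matching:
# the photo never leaves the user's device; only a short numeric fingerprint
# is shared, and platforms compare fingerprints to spot re-uploads.

from PIL import Image  # pip install Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Compute a 64-bit perceptual fingerprint: shrink the image to 8x8
    greyscale, then record which pixels are brighter than the average."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (px > mean)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance means near-identical images."""
    return bin(a ^ b).count("1")


def blocks_upload(upload_hash: int, victim_hashes: set[int], threshold: int = 5) -> bool:
    """Platform-side check: reject an upload whose fingerprint is close
    to any fingerprint a victim has submitted."""
    return any(hamming_distance(upload_hash, h) <= threshold for h in victim_hashes)
```

Because a perceptual fingerprint of this kind survives small edits such as resizing or recompression, a platform can recognise and block re-uploads of an image while never holding a copy of the original.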
