
Opinion: Deepfakes cause real harm and a change in the law is needed

Featured image: writer Helen Mort

Legislators must move faster to keep up with the evolving world of online exploitation and protect its victims. At present, UK law does not cover the making or distribution of deepfake images, leaving millions of people, predominantly women, vulnerable. This must change before more lives are damaged.

Deepfakes are digitally altered images or videos in which a person's likeness is replaced with, or merged into, somebody else's in a way that appears authentic. While some deepfakes have been used for television pranks and online memes, there are far more serious and sinister forms of this type of digital manipulation.

Deepfakes have been used to influence social and political views, with people pushing their own agendas or discrediting rival parties and politicians. Such posts spread faster than they can be monitored or taken down. And although fake news, misinformation, disinformation and deepfakes are distinct, each of these forms of manipulation can have serious consequences. Left unchecked, this trend could lead to a far less democratic society, or one with deeper political and social divisions. Unfortunately, fake news is 70% more likely to be retweeted than the truth, and in many ways deepfake images are a form of fake news.

Another concerning form of deepfake is the editing and merging of photographs with pornographic and violent content. The law does cover people whose private images have been shared without their consent and with intent to cause harm, as in revenge porn. However, images taken from public social media accounts are not covered, particularly when the resulting deepfakes are not used as revenge porn. Section 33 of the Criminal Justice and Courts Act 2015 makes it an offence to disclose ‘private sexual photographs or films without the consent of an individual who appears in them and with intent to cause that individual distress.’ This legislation must be modified to encompass these new forms of exploitation.

Danielle Citron, an American professor of law and author of Hate Crimes in Cyberspace, explains that technology “is being weaponised against women by inserting their faces into porn.” She continues: “It is terrifying, embarrassing, demeaning, and silencing. Deepfake sex videos say to individuals that their bodies are not their own and can make it difficult [for women] to stay online, get or keep a job, and feel safe.”

Deeptrace, an Amsterdam-based cybersecurity company, found 14,678 deepfake videos online in September 2019, a 100% increase since December 2018. 96% of these deepfakes were pornographic, and every one of the pornographic deepfake images and videos published online was of a woman.

Deeptrace also found that eight out of ten pornographic websites contained deepfake images or videos, and that nine websites were dedicated to this content entirely. As the technology improves, more people will be able to access and create deepfake porn. These websites should take more responsibility and be held to account. Yet profiting from sexual exploitation in this way is not illegal, which only adds to the incentive.

Helen Mort, a writer, broadcaster and lecturer at Manchester Metropolitan University, is one such victim and is campaigning for a change in the law. Since 2017 she has been targeted with deepfake revenge porn, yet another horrific way that someone can be victimised online.

Helen discovered her personal social media accounts had been used to access her private pictures. These photographs were being uploaded to pornographic websites without her knowledge, with an invitation for other users to photoshop her face onto explicit and violent sexual images.

Posing as her boyfriend, and using Helen’s first name, the perpetrator encouraged viewers to edit her images. This ordeal went on for three years.

In her online campaign, Helen expressed unity with all victims of sexual exploitation, abuse or harassment, adding that this is not her fault, just as women who are blamed for drinking alcohol, wearing ‘provocative’ clothing or sending pictures to a partner are not responsible for the actions others take against them. These crimes are always entirely the fault of the perpetrator. Victim-blaming must stop; it is both insulting and dangerous.

Talking through her experience in a Twitter video, Helen said: “We already know that misogyny can be a ‘gateway drug’ to other extremist hate crime, things like racist crime and terrorist offences.” Other extremist hate crimes are rightly punishable under UK law; this form of deepfake abuse should fall into the same category because of its potentially destructive effect on victims.

Helen also expressed herself through a poem entitled ‘Deep Fake: A Pornographic Ekphrastic’, a brilliantly powerful piece which demonstrates not only her talent but her courage and determination in bringing the perpetrator to justice. Although the experience was clearly difficult, Helen shared the poem with a sense of empowerment.

The poem expresses some of the content of the deepfake harassment, including a vulgar request for her image to be ‘used hard, abused and broken sexually.’ The poem goes on to describe her original photographs and the way they had been manipulated to create sexual, violent, and graphic images.

Though it was clearly a difficult video to make, Helen revealed she hoped that by doing so, she could help other victims of this abhorrent conduct. Looking directly at the camera, she reads the words: ‘I am not humiliated,’ sending a message of strength to those who wanted her in a position of weakness. Her final line is perhaps the one that shows the ultimate bravery of a woman who refuses to be anyone’s victim, a message of her reclaiming her life: “[Here] is the sound of history, forgetting you.”

Helen shared her story on Twitter and started a petition to have deepfake harassment discussed in Parliament, taking the first steps towards making it a punishable crime.

The law needs to change quickly, with penalties for those who make fake explicit images and videos, so that others are not forced to go through the same ordeal.

Support Helen’s petition at www.change.org/.

About the author

Kerry Power

Kerry is studying for an MA in creative writing at Manchester Metropolitan University. She has a degree in law, is a qualified primary school teacher and is a mum of two boys.

