
International Women's Day 2024 - Generated harms: the rise of exploitative AI

Updated: Mar 7



Doughty Street Chambers is a set of internationally renowned barristers who primarily focus on criminal justice, public law, human rights and civil liberties work. So, I was absolutely thrilled when they invited me to speak at their annual International Women's Day event on 2 March 2024. I spoke on a panel entitled Generated harms: the rise of exploitative AI, together with DSC barristers Fiona Murphy KC and Harriet Johnson, technical specialist Kavita Kalsi, and the CEO of End Violence Against Women, Andrea Simon. As the first speaker, I provided a high-level overview of the harmful ways that deepfakes can be used against women and girls: a transcript of my remarks is below.


I’m a media and entertainment lawyer - and I've been researching deepfakes for the last six years or so. My journey into this field was sparked by a disturbing yet insightful observation made by one of the biggest stars in Hollywood: Scarlett Johansson. In 2018, she told The Washington Post:

 

"nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired."

 

When asked how she dealt with this, she explained:

 

"There are basically no rules on the internet. I think it’s a useless pursuit, legally, because the internet is a vast wormhole of darkness that eats itself."

 

Thanks, Elon...


So after reading her interview, I set out to test Scarlett's claim: is the pursuit of legal remedies against harmful AI truly useless?

 

What Scarlett is describing here are those “face-swapping videos” called deepfakes. Deepfakes are created with a sophisticated form of AI called deep learning, and purport to show someone doing something they did not do, in a manner so realistic that it’s impossible or nearly impossible to tell that it’s a fake.

 

Deepfake technology dates back to 2014, but in 2017 the algorithm was released online on a Reddit forum for the sole purpose of sexually exploiting women.

 

We often hear these videos referred to as “deepfake porn”, but that term is misleading and harmful - and here's why.

 

Pornography involves the consensual participation of adults. Creating sexual deepfakes without the consent of the person depicted should be recognised for what it is: image-based sexual abuse, also called intimate image abuse. It's a violation that extends beyond an invasion of privacy; it assaults the individual's autonomy and dignity.

 

As mentioned earlier [by another speaker, in the opening session] deepfakes can be used for political and electoral manipulation, like the recent one of London Mayor Sadiq Khan. Can anyone take a guess as to what per cent of deepfakes are political? Most? 75%? About 50%?


[An audience member guessed 50%]

 

It's actually fewer than 10%.


Current research suggests that more than 90% of non-consensual, unwanted deepfakes are sexual in nature. And nearly 100% of those depict women or girls. Like many of you here today I’m sure, I believe the use of AI in this way is a human rights violation.  



That's me, Kelsey Farish, on the far left - together with DSC barrister Harriet Johnson, the CEO of End Violence Against Women, Andrea Simon, and technical specialist Kavita Kalsi (L to R).

 

So, what could a media and entertainment lawyer like myself have to add to the debate? 


Sometimes I fear that compared to the other panellists, I'm more BuzzFeed to their Bellingcat!

 

Alas! Deepfakes have evolved over the last decade, and they can also be used in consensual, creative, and educational ways.


I must therefore emphasise that not all deepfakes are “bad” – film studios, satirists, medical researchers, artists, and museums have used the technology for admirable purposes. Come find me later or drop me an email, and I’ll gladly tell you all about this.

 

I think most of us would accept that any proposed legislation on AI should also bear in mind the concepts of (1) intent and (2) freedom of expression. We have to be nuanced in our approach.

 

Let’s go back to Scarlett Johansson – and for a moment, let’s forget that she’s a fantastic actor who appears in blockbusters or wears stunning dresses on the red carpet. Let’s think of her as an ordinary woman – wearing sweatpants on the couch, out for dinner with her friends, or grocery shopping with her children. Let’s consider how the law – how regulations, how social media terms and conditions, how contracts - can protect her from harmful uses of AI.

 

Publicity laws are just one such element of protection. This bundle of laws – including those on defamation, privacy, and image rights - determines what can and cannot be said or shown about a person – and they are not just for movie stars.


They’re for anyone who happens to have a photo of themselves on LinkedIn, or Instagram, or a school website. In the real world, they’re for anyone who has a social life.

 

Look - AI is here, and love it or loathe it, it’s probably here to stay.

 

One of my most sacred ambitions is to help performers fully understand how their likenesses may be used or altered; and that includes modifications using AI tools. And through the contracts I draft and negotiate, I hope to protect the integrity of their performances, as well as their autonomy and dignity as performers.

 

It’s obvious that much of my work is built upon the advocacy and dedication of lawyers who specialise in human rights law, as well as technical experts and policymakers.

 

Looking at the panel here today – it may seem like an odd mix of specialisms. But the dialogue between our fields is crucial and strengthens our approach to safeguarding individuals against exploitation and harms posed by AI.

 

It’s going to be difficult – especially given the power of big tech giants, issues pertaining to access to justice, and the general craziness of Twitter and TikTok these days … but I believe the law can evolve to better protect women and girls, while still respecting the amazing potential that AI has to transform media, entertainment, and cultural heritage.

 


A coaster with "Kelsey Farish DSC IWD 2024 Inspirational Feminist Speaking Truth to Power places her drink on this coaster" written on it.
As a thank you for speaking at the event, I received this awesome coaster!

 

 

