Jodie, a 27-year-old from Cambridgeshire, received an anonymous email containing three links to an online forum where hundreds of photos of her had been posted, alongside lewd comments and requests for users to rate and fantasize about her. The person posting the pictures had also invited others on the forum to create sexually explicit “deepfakes” of her using artificial intelligence. The fabricated images were built from her own photos, including holiday snaps and pictures with friends and housemates, and depicted her in a schoolgirl outfit being raped by a teacher.
Jodie was horrified and reported the incident to the police, only to be told there was nothing they could do. The ordeal led to a breakdown and left her with long-lasting emotional trauma.
Deepfake abuse is growing, with reports increasing by 400% since 2017. The Revenge Porn Helpline, which has seen a surge in cases, believes stronger laws are needed to tackle the problem. The government has announced a crackdown on explicit deepfakes, promising to make creating and sharing them without consent a criminal offense. However, soliciting others to create deepfakes is not set to be covered.
The helpline has noticed that many perpetrators of deepfake image abuse appear to be motivated by “collector culture,” sharing and trading content for sexual gratification or status. Jodie’s experience highlights the need for better awareness of deepfake abuse, not just among the public, but also among the police.
The Revenge Porn Helpline has partnerships with major platforms to help victims remove abusive content, and it can also use facial recognition technology and reverse image-search tools to detect and take down material. However, the police response can vary, with some forces citing a lack of understanding of the legislation or questioning whether pursuing a case is in the public interest.
Jodie is calling for a change in the law, stating that the current loophole, which allows for solicitation of deepfakes, must be closed. She argues that the issue is not just about monsters or weirdos but about “normal people doing this” and that the law must be watertight and black and white to ensure accountability.
The core of the issue lies in the monumental impact deepfake abuse can have on victims, as seen in Jodie’s case. Years on, she lives in constant fear that the images are still circulating, and the experience has affected her friendships, her relationships, and her view of men in general.