'It was all just abuse': How pornography fuels violence


Government moves to ban deep fake pornography. Source: Getty / Diy13

Australia is in the midst of a reckoning, with nationwide calls to address gendered violence. Banning the creation and distribution of non-consensual deep fake pornography is one of a suite of measures announced by the federal government to tackle the issue. Experts say it's one piece of a broader jigsaw puzzle required to drive systemic change.



TRANSCRIPT

Sarah's (her name and voice have been changed for anonymity) former partner of 14 years was addicted to porn.

She says it put an enormous strain on their relationship.

"I was first exposed to porn early on in the relationship. I'd never watched porn before. I'd never been with anyone that watched porn before. It didn't make me feel really comfortable. And I asked that, you know, that we didn't have to watch it. I asked for the porn to be removed from the home and I was promised that it was. I had my suspicions, but I never really had evidence that he was still watching, and I found in his social media account he was getting messages from a friend of his that always had porn content in it and I addressed that, and he told me that it would stop, but then he was obviously deleting messages."

In 2015, it reached the point where, Sarah says, she was photographed without her consent while having sex with her partner.

"I found a hidden camera in my bedroom... I was in bed and it was quite a big TV in the bedroom and it was on the dressing table, but it was sort of hidden behind the TV. And I just happened to sort of, there's a bit of a gap between the TV and what it sits on, and I could just see this box and I thought, what's that? And it was a camera."

Upset and angry, Sarah questioned her husband's reasoning for having the camera in their bedroom.

Sarah says he sought to reassure her, claiming the camera was there for security and swearing that it had been pointing towards the ceiling, not towards her.

"And then I'd found photos of myself in a locked file that I got into that he'd taken without consent. So I don't know how they could have been taken with a phone camera."

She says the situation left her feeling violated.

"I'm going to cry sorry... I couldn't believe that it was happening to me. Anxious, depressed, fearful. And then angry. Because it had happened more than once, it happened quite a few times."

Over the years, Sarah proceeded to find other cameras - in the form of alarm clocks, a car remote, a Bluetooth speaker and reading glasses.

"You know, there was an alarm clock with a hidden camera in it. And I remember the alarm clock being in my bedroom. And, you know, his excuse for that: 'Oh, you know, I was gonna put that there, and you wanted security there', and (he was) just a master at gas-lighting. So it would make me feel, 'Oh, well, okay, maybe it's true. Maybe he did get that for over there,' because I was just constantly gas-lighting with everything. It was always, 'Oh, no, but you wanted that'. Or he would try and weave me into it somehow where I would think it was my fault. So I had years of that. It's abuse. It was all just abuse."

Sarah has now left that situation, but says the trauma remains.

"It was hard to leave because he's not-- he's very passive aggressive.... You know he can be nice and... I found it difficult... My whole life was wrapped up with him... But I've left. I don't intend on going back. It's not going to change. Yeah, I can't go back... And I'll be constantly, like, I still do-- I'm always looking for cameras, so I check everything."

Sarah is not alone in her experiences.

A national survey conducted by the Queensland University of Technology's School of Justice found high pornography exposure among young Australians helps fuel violence and violent attitudes towards women.

The survey, published in the Australian and New Zealand Journal of Public Health, found the average age of first porn exposure was 13.2 years for males and 14.1 years for females, among young people who had seen pornography.

And with the rise of artificial intelligence, concerns are now being raised about the non-consensual spread of deep fake pornography.

eSafety commissioner Julie Inman Grant says she's received multiple reports of young people being bullied by having purely synthetic pornographic images of them shared.

Australia is in the midst of a reckoning: a woman is killed every four days, many allegedly at the hands of a current or former partner, and it has prompted nationwide calls for change.

Prime Minister Anthony Albanese has since announced the government will introduce new laws to ban the creation and distribution of non-consensual deep fake pornography.

Mr Albanese says it's one of a suite of measures aimed at addressing gendered violence against women and combating toxic male extremist views online.

"We will introduce legislation to ban the creation and distribution of deep fake pornography, sharing sexually explicit material using technology like artificial intelligence will be subject to serious criminal penalties."

Those criminal penalties could include up to six years' jail time.

Attorney-General Mark Dreyfus told SBS News in a statement that, "These reforms will create a new offence to make clear that those who seek to abuse or degrade women by creating and sharing sexually explicit material without consent, using technology like artificial intelligence, will be subject to serious criminal penalties. We will also strengthen existing criminal offences to ensure that they apply to deep fake images."

So what are deep fakes?

The term combines 'deep' (from the deep neural networks used to generate them) and 'fake'.

In short, deep fakes are fabricated images, video or audio created by a computer program.

And the technology is advancing rapidly.

The University of Melbourne's Jeannie Paterson, who leads the university's Centre for AI and Digital Ethics, explains.

"In previous times, to create a realistic deep fake video, it was done often by merging two images. So we might have a face of a public figure, perhaps Barack Obama, and then a different mouth saying different things would be merged with that image of Barack Obama. And then a voice over would be added to create actually a famous deep fake of Barack Obama criticising Trump. That was done effectively by merging a combination of images and audio."

That technology has since moved on.

"Now generative AI can create realistic images from text. So deep fake images now, such as the ones that came out recently with the Met Gala, with celebrities that weren't there wearing costumes they didn't wear, can be created actually through quite straightforward methods. And it's what we call a synthetic image; it's entirely created. It's not merging two previous images. It's actually creating a new image that resembles a famous figure doing things they would never have done, or an ordinary person doing things they would never have done."

Things get a little more complicated when it comes to deep fake pornography.

Professor Paterson says that while adults are entitled to create entirely synthetic images for their own pleasure or amusement, it becomes offensive when another person's face or identity is appropriated into those images without their consent.

"Unfortunately, some of the earliest uses of deep fakes were for pornography or intimate image abuse often. So deep fake pornography often has the face of a person, that the person who's creating the pornography often wants to humiliate. And it's added to other pornographic images. So, and the real criticism there is with deep fake pornography is when it's used to humiliate and embarrass or offend people who never consented to these images being created or shared."

Because generative AI has only recently come onto the market, it's difficult to get an exact figure on its prevalence online.

But Professor Paterson says they've heard of instances of it happening in workplaces, schools and among the general public.

She also says that from the perspective of the victim, it shouldn't matter how it's created, or the frequency at which it is shared.

"What matters is that it's kind of realistic and that it's used to humiliate or intimidate or embarrass them. So, it's the use that's problematic rather than the creation of-- if you're the victim of a deep fake porn scam, you don't really care whether it's made by merging two images together in a clumsy way, or if it's been entirely synthetically generated using generative AI, you are hurt anyway."

So, how will this be policed and what are the current regulations that exist in Australia?

The answer isn't quite so clear-cut.

Australia has the Online Safety Act 2021.

Under Section 75 of the Act, posting or making a threat to post an intimate image of a person without their consent online is a civil offence.

Australia also has the Defamation Act 2005, which can be used to bring a case against a perpetrator who has created these fake images.

RMIT University lecturer in Information Systems and Business Analytics, Dr Shahriar Kaisar, says when it comes to policing deep fake pornography, it's still a complex problem requiring complex solutions.

"It's very difficult to determine who is responsible for the creation and sometimes it could be people from overseas as well. And sometimes they could just use something like a VPN connection to distribute. So then they become anonymous. So who are you going to take this action against?"

Given that this is an evolving technology, the guardrails are constantly shifting.

The internet is awash with apps and online tutorials on how to create deep fake images, making them increasingly accessible.

Further muddying the waters, when an image is distributed many times online, it can be hard to trace back who created the original image itself.

Dr Kaisar says there are places where people can go to find information, in several different languages.

"The Australian Cyber Security Centre, they publish periodic guidelines around all sorts of different cybersecurity issues, not only on deep fakes, but they provide generic guidelines around how to spot scams as well as covering all sorts of different materials."

Experts say the banning of deep fake pornography is just one piece of the puzzle when it comes to addressing some of the deeper cultural issues surrounding gendered violence.

Movember's global director of research Dr Zac Seidler says it requires a multi-pronged approach.

"I don't think the porn is at the centre of this. I think that there are way more important social determinants that are kind of going on. I think porn inflames all of those. So when you've got this substance misuse stuff going on, you've got a trauma history, you've got education issues, you've got poverty, you've got all of those things going on, and then you've got free open access to constant, complicated, problematic, misogynistic content that is really going to turn up the volume on the problem that are all of those existing determinants."

One such organisation seeking to address these issues is the Man Cave, a preventative mental health program for teenage boys.

The Man Cave runs workshops, camps and presentations at schools throughout Australia with a focus on exploring healthy masculinity and emotional literacy.

James Lolicato, the Man Cave's General Manager of Operations, says he sees many young men who went through the COVID-19 pandemic, and who have lived much of their school lives on a screen, increasingly drawn to negative influences online.

"And instead of having healthy masculine male role models that they can look up to within their school or their current environment, they're turning to these other resources, hearing what they have to say. And a lot of what they have to say verges on misogynistic, verges on facets of masculinity that are no longer the healthy versions of masculinity that we want to see young people grow up with."

He says because of this, many young men are confused about how to grow up and become healthy role models.

"And that's where we're seeing a real increase in domestic violence situations. One woman every four days is being murdered at the hands of their current or former partner, and it requires a whole of society change, a societal change that comes from a preventative mindset. How do we reach the young men before these toxic behaviours come up? Before they start existing? To really showcase what it means to be a healthy male role model to those around them, not just younger generations at their school, but also their friends, their family and those people who are looking up to them."

Movember's Dr Seidler says a major systemic and structural shift is needed.

"None of the men who, you know-- I've worked with plenty of guys in attempting to rehabilitate them post violence, and none of these men are happy. None of these men want to commit violence...No boy grows up wanting to hurt those that they love most. And I think we need to hold onto that. That is the rationality of the humanity of this entire piece here, which is that the way in which we treat our children has direct ramifications on the men that they grow up into...This is all interlinked. Homelessness, violence, mental health. These cannot be siloed approaches. They need to be understood as all overlapping. And we will solve a lot of our societal ills if we start to take that approach."
