With the rapid advancement of AI, creating ‘deepfake’ pornographic images has become cheap, easy and readily available. All that’s needed is a photograph of a person, and explicit images that appear to be them can be created – and shared on social media, degrading and devastating the victims. Some of those victims are New Zealand schoolkids, but will the law move quickly enough to address it? By Gill Higgins.

Watch this story on TVNZ+, our home for news, in-depth and consumer stories.

Websites that allow anyone to create explicit pornographic images have proliferated in the last few years. The marketing is unapologetic: the deepfake sites lure men and boys with lines like “Imagine taking the time to take her out on dates when you can just use the website to get her nudes?”

Over a dozen websites now enable even the most technologically challenged person to “undress” someone, as long as they have a photo of them. The sites can also place an unknowing victim’s image in all sorts of explicit poses. It’s disturbing enough that such images can be created and used for personal gratification, but once they are shared on social media, the effect on victims can be devastating.

Creating and posting deepfake porn without consent is a form of image-based abuse that Auckland-based tech expert and journalist Finn Hogan says is already out of control. “The cat is out of the bag with this technology, the bar to create a tool like this is getting lower and lower every day, and the output from these models is getting more and more sophisticated.”

Tech expert and journalist Finn Hogan

What does that look like in New Zealand? Thankfully, not the same as it does in South Korea, where women are protesting in the streets while police play catch-up with undercover stings. A South Korean news outlet reports that 1,415 people have been arrested there since 2021. In one case, teenagers had sold over 1,000 images in a Telegram chat room.

NZ schoolgirls devastated

New Zealand is not at that crisis point. But we already know of two schools reeling from the spread on social media of deepfake pornographic images of students. Fifteen of those students were at a school in North Canterbury, and around 50 at another location we’ve been asked not to name.

Both schools declined to share any details, saying their focus was to help students move on, as many still felt violated. But one of the schools gave permission to Vaughan Couillault, the president of the Secondary Principals’ Association of New Zealand (SPANZ), to speak on its behalf, as he’d supported the school through the ordeal.

Couillault didn’t mince words when asked about the impact of the incident. “It was biblically large. Lots of emotions, and they’re still dealing with it now. It doesn’t go away with a couple of sessions with the counsellor. It is ongoing and problematic because things loop around; it’s very hard to get things off the internet.”

School kids are very vulnerable to this form of technology.

The schoolgirls weren’t from just one culture, and that added an extra layer of complexity. “Where you’ve got cultures that deal with the purity of their young women differently to other cultures, the consequences can be greater: ‘do I stay in this community? Do I stay in this class?’” Couillault adds: “It’s not as low level and as simple as the perpetrator might think it is. It’s not cheap entertainment. It’s life-damaging work.”

With the click of a mouse

But it’s so ridiculously easy. I met with Catherine Abel-Pattinson, who is Chief Operating Officer at Netsafe. She’s a highly experienced company director, but nothing had prepared her for what I was asking her to do.

I wanted to see how simple it would be to create a deepfake porn image and, at the same time, discuss the impact image-based abuse can have on victims. Catherine was the perfect person for the role. She was reluctant at first but came round to the idea, saying she saw the need for New Zealand to have this conversation.

Gill Higgins and Catherine Abel-Pattinson

Our discussion took far, far longer than the time it took to create an explicit image. For a start, within milliseconds, a Google search produced a list of sites that enable the creation of deepfake porn images. Netsafe’s chief online safety officer, Sean Lyons, had told me the technology really is easy to find and use anytime, anywhere. “It could be a young person on the bus coming home from school. It could be a group of friends at the mall who have heard of this and think it’s hilarious. Could be an adult sitting at home. You couldn’t tell the difference between somebody doing this and somebody playing Wordle.”

It’s important to note, though, that Google’s safety settings effectively block these kinds of sites.

For our purposes, those settings were off, and we chose the first website we came to, one we know gets at least 4 million views every month. Click on the site and you’re taken to a set of rules. You must be over 18 to enter the site. You must have permission to use the photos you’re about to manipulate. Stern stuff. Except all you need to do, all anyone needs to do, is click “Accept” and away you go.


‘You could really damage someone’s life’

You just need to watch my piece on TVNZ+ to see how viscerally shocked I was by the images that then appeared. I wasn’t expecting examples to pop up the way bestsellers do when you open your favourite clothing store’s website.

They were explicit and, without safety blocks, available for any child or teenager to see. And on this site, you can manipulate your first nine images for free. Just scrape images of any person from Instagram, download them, get them “undressed”, place their image in disturbing, explicit positions, and those deepfake porn images are ready to share.

We took a photo of Catherine Abel-Pattinson at work in her smart office attire. It looked a bit like a picture for an ID card, except we included more than her head and shoulders. With the photo uploaded, we pressed “undress” and waited for her clothes to be removed.

‘There I am, naked’

It was so quick, and so much more real-looking than we’d expected. “There I am, naked, and honestly it looks like me. Absolutely 100%, I would say that’s what I look like.” It’s not your usual experience to sit with a stranger and see yourself appear without clothes. But this was a tame version: it was created with Abel-Pattinson’s consent, and she trusted me. How would she feel if she saw that image in a social post, out of the blue, and knew it had possibly been shared elsewhere?

“I think as a woman, you’d feel hysterical, and if you’re a teenager, I think you’d feel really hysterical, particularly if they threatened to send it around the school. This is a real threat, and unfortunately, I can see this happening more and more, because this is just so easy to do. I mean, it took me five minutes, and you could really damage someone’s life.”

‘One of the great problems of our time’

Catherine Abel-Pattinson’s workplace, Netsafe, is an organisation set up in New Zealand to help prevent online harm and to help victims navigate what can be a frustrating process to get images removed. So she works with people who deal with this problem all the time. She says they are already seeing the effects of deepfake porn being used without consent.

“Every day we answer the phone and we have suicidal people on the other end, because stuff like this has been sent. And it’s not them.”


Catherine’s image was “only” undressed, but there are options for creating very explicit sexual poses. Given schoolgirls are among those who’ve been affected, some as young as eleven years old, this is a grave concern.

Finn Hogan agrees. He fully embraces the world of AI but is equally fearful of its power in the wrong hands. “This is one of the great problems of our time, because there are very few technical solutions to be able to tell what is deepfake and what is not.”

Shut one down, another opens

He’s also well aware that the companies allowing the use of this technology won’t be stopped. Some countries may take bold moves – just this week, a lawsuit was filed in San Francisco in a landmark case attempting to shut down 16 of the most-used websites for creating deepfake porn. But the internet is international, and Hogan says shutting one site down will only pave the way for another to open.

In New Zealand, police say the crime of posting sexual material without consent is on the rise, and AI will only make it easier. At present, statistics cover sexual material as a whole; fake sexual images aren’t counted separately. Still, the numbers show a doubling of proceedings taken by police in the year from 2022 to 2023, with the biggest increase among very young offenders: in those aged 10 to 14, proceedings jumped from just one to 16.

Legal shortcomings

Young offenders are dealt with through warnings or the Youth Court, while adults can be prosecuted by police. But to a certain extent, police hands are tied when it comes to prosecuting offenders for image-based sexual abuse, because real and AI-generated images aren’t legislated against in the same way.

For real images, intent to cause harm isn’t required, thanks to a 2022 amendment to the Harmful Digital Communications Act. But the amendment didn’t cover images created by AI, despite the fact they can look just as real and cause just as much harm. Basically, legislation hasn’t kept up with technology.

Arran Hunt is an Auckland-based lawyer at McVeagh Fleming, specialising in technology and AI issues. He says the legal profession saw this situation coming, and he was one of several who argued strongly for the amendment to include AI-created pornographic material. He’s disappointed it didn’t. Other countries like Australia and the UK are way ahead and have this covered.

Where does this leave NZ victims?

For a criminal prosecution over AI images, the police need to be able to show “an intent to cause harm by making the communication”. The police then run the prosecution and cover the costs, and a conviction can bring up to three years in jail and a $50,000 fine.

But if that intent can’t be proven for the creation or sharing of explicit AI images made without consent, the only recourse is civil action, meaning the victim has to pick up the bill themselves. Hunt says: “The richest party often wins, it can take years, often without an outcome, and the court can only order a takedown and an apology. That’s it.”

Hunt says that’s often the only option as intent to cause harm can be hard to prove. “We see people who don’t have a modern view of what we’d see as being harmful, and they’d go, it’s a bit of a laugh. Why not? Without realising they’re actually really hurting someone emotionally. They’re destroying them.”

"They're actually really hurting someone."

Hunt would like to see New Zealand catch up with other countries: “A relatively simple change to the Act would actually cover this quite nicely.”

Government’s lack of urgency frustrates

Justice Minister Paul Goldsmith acknowledges that AI porn created without consent is causing real harm. But he doesn’t seem to be in any hurry to act: “It’s an issue raised a few times, and the initial advice I’ve received is that the legislation covers a lot already, but certainly this is something on our radar further down the line.”

Justice Minister Paul Goldsmith.

Hunt isn’t impressed. In fact, he’s worried. He fears it will take a big, salacious story in the media involving a teenage victim self-harming for the government to see the urgency. “We already have people who are getting harmed now, why wait for that one big story to actually fix things?”

Watch Gill Higgins’ full in-depth report on what, and who, is behind our growing fake porn problem on TVNZ+
