A private member's bill that would criminalise non-consensual deepfake porn has been drawn from the ballot in Parliament.
The bill’s author, Act’s Laura McClure, has welcomed the news that her proposed legislation will go before MPs.
She said: “Parliament now has the opportunity to empower victims of deepfake abuse with a clear pathway toward the removal of the images and the prosecution of their abusers.”
1News reported last week on a victim of AI deepfake porn who suffered widespread humiliation at the hands of a man she had rejected in high school, and who was able to see him charged and prosecuted.
This is believed to be the first prosecution for deepfake porn under New Zealand law.
The images depicted the victim with family and friends. In some, she was just a young teen. In all of them, she was completely, and convincingly, naked while everyone else remained fully clothed.
Under current law, police can only prosecute if they can prove intent to cause harm. In this case, which also involved three other young women, the offender had confessed and eventually pleaded guilty. He will be sentenced early next year.
In other countries, the sharing of explicit deepfake images is treated as a crime, whether or not there’s proof of intent to cause harm.
McClure’s bill would change that. It amends the Crimes Act 1961 and the Harmful Digital Communications Act 2015, expanding the definition of “intimate visual recording” to explicitly include images or videos that are manipulated by AI without consent.
“The harm is real and it’s happening right now,” she said.
Private members' bills allow individual MPs to propose legislation, but they need the support of parties to become law. When 1News asked each party about McClure's bill, only Te Pāti Māori would confirm its support at this stage.
The Deepfake Digital Harm and Exploitation Bill is set to be introduced to Parliament next year.