AI Can See You Naked

The terrifying world of synthetic porn

CW: Explicit discussion of sexual exploitation, rape

Christchurch has produced many good things over the years. It has also produced Michael James Pratt, a fugitive sex trafficker on the FBI’s most wanted list with a $70,000 bounty on his head. A database of coerced nude photos and videos collected by Michael and his team is at the core of the deepfake porn business, where anyone’s face, including your own, can be put on these women’s bodies.

Michael moved to California in 2007 and started a porn business, one that landed his best mate in prison and traumatised hundreds of women over the course of a decade. His website, GirlsDoPorn (GDP), peddled videos of women who were coerced into sex acts after being promised that the footage would remain private. It did not. The videos were heavily edited to remove any traces of coercion, and while the website was shut down in 2019, those videos are still being used to train artificial intelligence (AI) bots to generate synthetic porn. These algorithms are forcing a rethink of the way we legislate pornography.

Synthetic porn relies on deepfake technology, which operates a bit like a Snapchat filter. It’s how Disney put Princess Leia back in Star Wars, how Russian media concocts fake news clippings, and how anyone with access to your photos can put you in a porno. The technology has exploded over the last five years, and as early as 2017, Reddit users were using it to create porn of whoever they liked. Its legal standing is still up in the air, but as of today, anyone with a few bucks can pay an app for custom-made, AI-rendered porn. Of anyone. Including you. Distributing that porn is another matter - but more on that later.

In order to learn how to build a real-looking human, the AI first needs reference material. This is true for any AI. If you want an AI to draw a carrot, you first have to show it thousands of images of carrots. DALL·E Mini, the internet’s current favourite AI, got its dataset by scouring the internet for millions of captioned images, so when you ask it for a “carrot” it generates its best approximation of what all those pictures taught it a “carrot” looks like, with a second AI ranking the attempts. If you typed “naked woman in bed” into DALL·E Mini, you’d get a Lovecraftian horror of a human body, because the AI is drawing on the entire internet’s collection of “naked women in bed”, which is far too variable to yield a convincing result (we’ve included an example of this, which was not sourced from illegal porn).
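To make the “show it thousands of carrots” idea concrete, here is a minimal sketch of how this kind of image-generating model gets trained. This is not DALL·E Mini’s actual code, and certainly not any porn app’s code; it’s a toy generator-versus-discriminator loop in PyTorch, with random tensors standing in for scraped images, to illustrate the one point that matters here: the model can only learn to produce what its training data contains.

```python
# Illustrative sketch only: a tiny GAN-style training loop in PyTorch.
# The "dataset" is random noise standing in for scraped images.
import torch
import torch.nn as nn

IMG_PIXELS = 64 * 64 * 3   # a tiny 64x64 RGB image, flattened
NOISE_DIM = 100

# Generator: turns random noise into an image-shaped tensor.
generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_PIXELS), nn.Tanh(),
)

# Discriminator: guesses whether an image came from the dataset or the generator.
discriminator = nn.Sequential(
    nn.Linear(IMG_PIXELS, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

# Stand-in "dataset": in a real system this is thousands of scraped photos.
# The generator converges toward whatever these images look like; it cannot
# invent anything it has never been shown.
real_images = torch.rand(512, IMG_PIXELS) * 2 - 1

for step in range(100):
    batch = real_images[torch.randint(0, 512, (32,))]
    noise = torch.randn(32, NOISE_DIM)
    fake = generator(noise)

    # Train the discriminator: label real images 1, generated images 0.
    d_opt.zero_grad()
    d_loss = loss_fn(discriminator(batch), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    d_loss.backward()
    d_opt.step()

    # Train the generator: try to make the discriminator say "real".
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_loss.backward()
    g_opt.step()
```

The takeaway is the last block: the only signal the generator ever receives is “does this look like the training data?”, which is exactly why the quality and the source of that data matter so much.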

This is because a human being is much more complicated than a carrot. We have limbs and folds, and besides, the human brain is much better at noticing tiny abnormalities in a human face than in an inanimate object (which is why the people in The Polar Express looked so weird but the train and trees were fine). Some of the higher-performing, publicly available AIs won’t draw porn at all – you’ll get a content warning if you try. But a glance at other users’ requests shows people trying to get around the filter by asking for scantily-clad anime characters, or celebrities in their underwear. By and large, though, the AIs struggle to make a perfect human body. In the screenshot below featuring Megan Fox, the user who requested the image actually had to specify that he wanted a “symmetrical face.” The AI just has too much data to pull from, and gets confused. So if you want a machine to learn how to render human beings, you need to give it the most precise dataset possible. You’d need to feed it thousands of photos of human bodies in various, but uniform, positions. You’d need a dataset like the ones found on GirlsDoPorn.

The smut films Michael made for GDP were all shot in a semi-uniform way, so when a Reddit user found this trove, alongside an also-illegal and remarkably-uniform database from Czech Casting, it seemed like the perfect training material for a new, porn-centric deepfake AI. Czech Casting provided more images than GDP, and is arguably an even worse offender, but its videos are still available online. These images were downloaded for AI training before the GDP site was shut down, so the creator of the AI wasn’t aware they were illegal. But they are - and they’re still being used.

The first results of this AI were very messy, and barely resembled human beings. But that was in 2017, and a lot has changed since then. Today, there are multiple deepfake porn AIs, which go beyond just photos: full-length videos of celebrities (or anyone with heaps of online material to reference) are readily available, possibly protected from the law by titles like “NOT Scarlett Johansson having a threesome” (the Google searches for this research, by the way, were genuinely awful. We’re probably on every watchlist from here to Canada, and we’ll let you know when the FBI comes knocking). But it’s not just celebrities; this software can take an image of your face, pick out focal points, and map a projection of your face onto another body. In some cases, this body is real, pulled from an existing porno. In other cases, the entire body is a fake, built by an algorithm trained to mimic human form and movement. It will match your skin tone, your body type, and even your expressions. These algorithms rely on a massive database of real, human photos, which is where our scumbag Christchurch friend comes in.
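The “pick out focal points” step is, in plain terms, facial landmark detection followed by alignment. Real deepfake tools do the actual face replacement with learned neural networks rather than simple geometry, so the sketch below is only our simplified illustration of the landmark-and-alignment idea, using the open-source MediaPipe and OpenCV libraries; the file names are hypothetical placeholders, not any app’s actual pipeline.

```python
# Illustrative sketch only: detect facial landmarks ("focal points") in two
# images and estimate the transform that lines one face up with the other.
import cv2
import numpy as np
import mediapipe as mp

def face_landmarks(image_bgr):
    """Return an (N, 2) array of facial landmark pixel coordinates."""
    h, w = image_bgr.shape[:2]
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True) as mesh:
        result = mesh.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        raise ValueError("no face found")
    points = result.multi_face_landmarks[0].landmark
    return np.array([[p.x * w, p.y * h] for p in points], dtype=np.float32)

# Hypothetical inputs: a photo of one face and a frame from another video.
source = cv2.imread("source_face.jpg")
target = cv2.imread("target_frame.jpg")

src_pts = face_landmarks(source)
dst_pts = face_landmarks(target)

# Estimate the rotation, scale and shift that line the source's focal points
# up with the target's, then warp the source into the target's frame.
matrix, _ = cv2.estimateAffinePartial2D(src_pts, dst_pts)
aligned = cv2.warpAffine(source, matrix, (target.shape[1], target.shape[0]))
cv2.imwrite("aligned_face.jpg", aligned)
```

The focal points themselves are just coordinates; everything uncanny about deepfakes happens afterwards, in the generative model that fills in skin, lighting and expression, which is why those models need the enormous photo databases described above.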

To build this collection, Michael moved to California. He’d had a stint of porn-making in Aotearoa, and was joined in 2011 by his mate Matthew Isaac Wolfe, who evidently was in Christchurch for the earthquake, said “fuck it” and gapped it overseas while his countrymen cleaned up the mess. During its first ten years, their company made US$17 million by selling videos of women being coerced, drugged and raped. The company ignored victims’ pleas to remove the videos, and the law took no action for a decade.

Michael, Matthew and a contracted cock named Andre Garcia ran the GDP operation. Their business model was simple: they’d put out ads attracting girls aged 18-20 with the prospect of an all-expenses-paid trip to San Diego for a modelling gig. Once in the hotel room, the boys would inform the model that it was actually a porn shoot. Court records show that they’d promise that the videos would remain in private collections, not to be distributed, and that the girls would be paid $5,000. Somewhere in these proceedings, their lawyer (who we can only assume is deaf) described how their “charming” Kiwi accent helped them coerce women.

Warning: the content in the next paragraph is disturbing. 

It was all a lie. Many girls wanted to back out of the shoot, but were physically blocked from leaving. They were threatened with exposure online and with financial consequences. Once coerced into sex, often under the influence of drugs or alcohol, records state that many cried out for help, or begged to leave the room. Andre, the porn model, under the direction of Michael, would then proceed to rape them. All of this was recorded and published online, freely available to anyone with an internet connection, including a Reddit user by the name of u/GeneratedPorn. He took these videos, along with the ones from Czech Casting, and used them to train an AI. All of the images produced by this specific AI are based on videos of women being raped, and since the world of deepfake porn is so insular, it’s possible that others are still using the database.

Samantha Cole, Senior Editor of Vice’s Motherboard, has covered this story extensively. Her list of publications includes wonderfully dystopian titles like “AI-Assisted Fake Porn Is Here and We’re All Fucked” and “It Takes 2 Clicks to Get From ‘Deep Tom Cruise’ to Vile Deepfake Porn”. She didn’t respond in time for this story, but her research goes deeper into this rabbit hole than anyone’s. Samantha was in touch with the creator of the AI porn subreddit, who decided to shut down his project just one week before her extensive story on AI porn was published.

Before he shut down the project, the user had been in touch with plenty of other people. He detailed the sites he used to source his training material, including the videos shot by our fuckhead friends from Christchurch (by the way, Michael, if you’re reading and you’re upset at the names we’ve called you, feel free to call or mail us. I’m sure the FBI would love to hear from you). We tried to dig up more info on Michael by Facebook-stalking everyone we knew from Christchurch with the last name “Pratt”, but stopped short of messaging strangers to ask if any of their relatives had recently been put on the FBI’s most wanted list for sex crimes. Something about that felt inappropriate.

After the GDP site was shut down in 2019, and after Michael’s co-conspirators were arrested, AI porn really took off. And this brings us to today. Just last week (August 2022), while Matthew was being sentenced to 20 years in an American prison and Michael remained on the run, a popular face-swap app was creating AI renders of anyone a customer wanted. Based on its Reddit connections, it’s possible that this entire operation revolves around a dataset of women’s bodies collected - in part - by sex traffickers from Czech Casting and two Kiwis from Christchurch. Until the database used by this app is made public, there’s no way to prove whether GDP content is in there, and no way to know whether these renders are built on the naked backs of traumatised women. And this app is still legal to download on the app store - the law hasn’t caught up. But while the law has lagged, the coding has exploded. These images and videos are a far cry from the garbled mess of 2017; they have movement, they have sound, and they all have the potential to cast you in the starring role.

So what happens in that case? What happens when you open your phone to see yourself starring in synthetic porn? Critic Te Ārohi sat down with Professor Colin Gavaghan, who researches law and emerging technologies. Colin, with a very charming Glaswegian accent, gave us the good news: “If you found [porn of yourself] online, I think you'd have a fairly slam dunk case.” New Zealand is somewhat ahead of the curve in this area; our Harmful Digital Communications Act may be well-suited to adapt to this new type of crime. Other countries don’t have this type of legislation in place at all. But what if the production was outsourced to one of them? “You could get a civil order to have the content taken down, and maybe a criminal prosecution as well,” said Colin. “But of course, that all gets tricky when it’s happening in a different jurisdiction.”

This type of crime, making AI porn of somebody without their consent, is about control. It’s about taking over the autonomy of another human being, and stripping them of their dignity as well as their clothing. Olivier Jutel, in the MFCO Department, explained a school of thought called the Californian Ideology. It essentially says that all of computing technology, down to its very core, is a relationship between a slave and a master. The computer is our digital slave, constructed to obey our every whim, and we play the role of a sort of digital God, “Thomas Jefferson on the digital plantation”. It’s no surprise, then, that these fucked-up fantasies surface online, in an environment that is centred completely on themes of dominance and subordination.

Colin reckoned that the first court case about AI-generated, non-consensual porn would end up between three pillars of existing law: privacy law, defamation law, and the Harmful Digital Communications Act. No one branch of legislation would specifically apply to this case, but any one of those three could step in and claim it as its jurisdiction. “But my pick”, said Colin, “is that if it got to a court and the court said, ‘no, the current law doesn't apply’, it wouldn't be very long before we had a new law that did.” But Tom from the Brainbox think tank offered a different reality. He cited a 1 News article that found that Netsafe received four complaints of deepfakes in 2020 alone – including at least one instance of synthetic porn. It’s already here. He also said that the government neglected to adopt a change in legislation that would have guaranteed deepfake porn was included under the HDCA umbrella. As it stands, the HDCA is set up to address things that are recorded with a camera – synthetic porn could be a loophole.

So what happens next would all depend on what the porn actually depicts, and whether it was shared. If the content was completely synthetic, if it didn’t depict you specifically, it might be quite tricky to build a case against it; a case could still be raised if the image was generated from a database of illegal pornography, like the ones built from GDP or Czech Casting. “If it was an image of a non-existent person, but made from images of real people, which were acquired by subterfuge or coercion,” explained Colin, “then I think there wouldn't be much difficulty [making a case]. I think you should probably be able to establish that that's illegal.” If you can’t prove that the source images were illegal, though, then that’s another story. But in that case it’s a made-up person, unrecognisable as anyone real, so it’s less likely to be harmful. That being said, it opens the door to even scarier questions: what happens if someone makes an AI porno of an illegal act? Say, one depicting a minor? Is that still illegal, if it’s completely synthetic?

But let’s say that the image isn’t synthetic; let’s say that it’s an image of you. “It would have to be more than just fake, it would have to be fake and harmful,” said Colin. “So yeah, if someone - God help them - wanted to create a digitally rendered image of me, naked, if they were that desperate, could they take a picture of me on the street and then put it through this kind of filter? I'm gonna have to say the law is not clear.” Personal privacy has not been tested in this way, and if the image is never shared, we simply do not know what would happen. That job falls to you, as students; Colin said that the current group of law students will be the ones grappling with AI-generated, image-based abuse in the legal arena. Some of them, like the ones behind the Brainbox think tank, already are. If you want to get involved in this space, Tom said that Brainbox is always looking for new voices, and suggested you reach out.

New Zealand is ahead of the curve when it comes to regulating digital crime, but has failed to fully get ahead of the deepfake porn problem. Still, at the end of the day, according to Colin, if you find a pornographic image of yourself, if it’s something that obviously causes harm, “you have a pretty good chance [of taking it to court].” What exactly is a pretty good chance? Why is it not a certainty? Well, to put it simply, there hasn’t been a court case yet. But when there is one, it will “fall between these two areas that laws typically deal with: actual private information about us on the one hand, and false claims about us on the other hand. This isn't really either of those things.” An AI-rendered porno of you is a new image; it’s not a photo someone took up your skirt or in your dressing room. It’s not an invasion of privacy in that way. It’s also not necessarily claiming to be real, so it’s not necessarily defamatory. But it certainly feels wrong, somehow, and it’s probable that the courts would agree. We don’t know who the first victim will be, and there’s little we can do to stop it. Until that happens, or until the government gets ahead of the curve, all we can do is wait.

This article first appeared in Issue 20, 2022.
Posted 4:32pm Friday 19th August 2022 by Fox Meyer.