Can we fight deepfake porn?

I suspect that ‘banning’ AI-generated nudes is an implausible project

The foul-mouthed puppet musical Avenue Q, way back in 2003, caught the spirit of the age to come. “The internet is for porn!/ The internet is for porn!” runs one of its more memorable songs. “Why do you think the net was born?/ Porn! Porn! Porn!” Never was a truer word sung by a copyright-skirting knockoff of The Muppet Show. What those puppets didn’t see coming, though, was the effect on this basic truth of the development of artificial intelligence.

A report this weekend in the Washington Post came headlined: “AI fake nudes are booming. It’s ruining real teenagers’ lives.” The story opened with a case study. A twenty-six-year-old YouTuber called Gabi Belle discovered that, though she’d never posed nude, upwards of one hundred sexual images of her were circulating on internet porn sites. They had been generated by AI — which can now extrapolate convincingly from clothed photographs of people to show what they would look like undressed, or edit their faces into pornographic videos.

Those who make it their business to keep an eye on these things report that in the last five years the number of AI-generated fake nudes in circulation has gone up almost threefold. More AI-generated fake porn videos have already been posted in 2023 alone than in the six years previously. On porn message-boards, anonymous altruists promise “I can fake your crush.” Respondents say who they’d like to see in the buff — celebrity or next-door neighbor — and within moments a reply containing the image appears. We were all worried “deepfakes” would be used for political messaging, but Avenue Q had it right. Apparently a full 96 percent of all deepfake images are pornographic.

You could think of this as twenty-first-century tech giving a new lease of life to a widespread practice that in the twentieth century used to be called mentally undressing people. It is a longstanding recreation of teenage boys (among others) to imagine what people they fancy would look like with no clothes on. I daresay for as long as teenage boys exist in their present form it’ll be part of their lives. It just happens to be the case that these days, instead of writing off for the X-ray spectacles advertised in the back of old Fantastic Four comics, they have access to computer programs that will make a pretty good guess and, what’s more, paint them a picture. It’s icky, and in addition to causing them to fall into the ancient sin of Onan it also commits the modern sin of objectification. But it’s hard to imagine it going away.

We must deal with the facts of the world as they are, rather than as we would prefer them to be. And I suspect that “banning” AI-generated nudes through local or even international law, through codes of conduct or terms and conditions, is a project up there for feasibility with King Canute’s marine experiments.

As the Washington Post’s report admitted, there’s nothing on the federal statute books in the US, and not much state-by-state, to control deepfake pornography. It’s tricky, indeed, to think what an effective set of laws against it would look like. As we’ve already seen, AI runs rings round existing intellectual property laws. This case is no exception. Deepfake filth will likely circumvent such copyright protections as exist for individual likenesses, since even when they resemble a particular person their images will be created from vast datasets. And in how many cases, realistically, is the victim going to have the money or patience to put the proposition to the test by litigation?

The internet isn’t completely unpoliceable, but it’s very tricky to police; and trickier still to police at scale. Look at the success of our efforts so far in combating political disinformation, incitement to violence, ads for mobile phone games that totally misrepresent the gameplay, and quotes attributed to Gandhi that he never said. What hope is there of plugging the firehose of deepfake porn?

It’s hard enough sifting for illegal images in the vast mass of traditional, slow-food pornography of the filmed-in-real-time-and-involving-humans type. AI is well on its way to being able to churn out feature-length never-happened bongo, featuring your favorite celebrity or your study partner in A-level chemistry, faster than all the humans alive can watch it, let alone issue takedown notices and follow up on them. So unless I’m missing something, horrid though it may be, deepfake pornography is going to be an unavoidable feature of twenty-first-century life.

That’s bad. I don’t minimize it. It will be deeply unpleasant, as a self-conscious teenage girl or a shy teenage boy, to find the internet contains artificially generated, and plausible, pictures of you naked or worse. There will be a special mortification if these images are being WhatsApped back and forth by the sniggering thugs among your classmates or targeted to humiliate you. But I think there may be a silver lining.

One of the many vile features of the digital age, for two or more decades now, has been “revenge porn.” Couples who merrily shared fruity images with each other would fall out, and one or other of them, or a third party who came into possession of them, would then spitefully share those images without permission. iCloud accounts were hacked and nudes posted on Reddit. Blackmail sometimes entered into it. Several people have died by suicide after sexual images of them made their way onto the open internet.

But the special cruelty of revenge porn is associated with the certainty that anyone who sees these images — friends, parents, colleagues — will believe that they are real. The value of the original leaked sex tape, One Night In Paris, to the repulsive lowlife who profited from it was that ghouls and pervs would pay to see the actual, real Paris Hilton in an intimate situation without her permission.

If making deepfake porn is something anyone can do, to anyone, in a few keystrokes, nobody’s going to be in a position to tell what’s real or what’s not. The bottom will fall out of the market. If someone emails you to say that unless you pay them a hundred bitcoin they’ll send a video of you deep-throating a horse to your mom on Facebook, you will be able to tell them to go whistle. Indeed, you could send them your own version of the video by reply, and tell them publish and be damned.

Speaking of the bottom falling out of a market, what about real pornography? Once AI can produce, at next to no cost, plausible images of anyone you want doing anything you want to anyone else you want, who but the most perversely retro self-abuser is going to seek out the expensive, unsatisfactory, old-fashioned sort? Who’s going to turn a profit making it? That seems to me, broadly, a good thing. As my colleague Julie Bindel and many others have been arguing for years, the porn industry is built on exploitation and abuse. It humiliates and degrades its participants, and it humiliates and degrades the users who are complicit in that abuse.

The second part of that equation is still a matter for concern. The purchase funnel that leads heavy porn users to ever more extreme content (all that choking and spitting and rape fantasy), and which has been shown to damage their psychological health and sexual function, may not be improved by algorithms that will be quite incapable of saying: “I think this goes a bit far.” But at least it will not have to be supplied by real, often addicted, often trafficked, “performers.”

There’s a whole separate argument about what on earth you do with, say, artificially generated pornography that depicts children or sexual violence. It may not generate obvious real-world victims, but it’s a strong stomach that would say the appetites that demand it should be encouraged. Still, it’s a step in the right direction, surely, to be living with AI images and the appetite for abuse rather than dealing with real images and the fact of abuse. Deepfake porn, if we can’t get rid of it, is something we may have to make the best of.

This article was originally published on The Spectator’s UK website.
