My Tweet About AI Porn Went Viral, And What It Taught Me Was Upsetting

We've all reacted to the AI porn that portrays people who don't actually exist. But what would your reaction be if the face on the fake porn was yours?
River Page

Editor's Note: The arrival of mature artificial intelligence promises no shortage of miraculous possible future technologies, from the virtual assistant who saves your life to the perfect, free doctor, adaptive learning at scale, and the one-man media company 1,000 clones strong. But it is also now possible to simulate realistic photos, of real human beings, in any scenario you can imagine. Let’s translate that to the language of our gutter internet: you can now look at pictures of anyone you know — celebrity, enemy, girlfriend of your very best friend — having sex. We have absolutely no idea how to manage the cultural fallout of this phenomenon.

An internet drama surrounding several social media influencers embroiled in a sexfake controversy is now catching fire across Twitch and Twitter. It is a bizarre and incredibly important story that has, for some reason, been entirely ignored by the press. Perhaps because most “tech” reporters have no idea what artificial intelligence even is, let alone what its arrival means.

In any case, we’re on it. River Page reports.

-Solana

---

Earlier this week, Twitch streamer Atrioc faced backlash for purchasing AI deepfake porn of two fellow streamers, one of whom was the massively popular Pokimane. The ensuing controversy spiked traffic to the site where the deepfakes were being sold, leading other female Twitch streamers to notice that the creator of the Pokimane deepfakes had also made porn featuring their likenesses. In the aftermath, one such streamer, QTCinderella, went live on Twitch, where she cried throughout the entire three-minute stream, choking on her words and repeatedly saying she knew she shouldn’t be doing this, but that she “want[ed] to show people what pain looks like.” The video was raw.

Screencap of QTCinderella’s deepfake reaction stream

“Fuck the fucking internet. Fuck Atrioc for showing it to thousands of people. Fuck the people DMing me pictures of myself from that website. Fuck you all! This is what it looks like, this is what the pain looks like.” She wept, her face red, tears streaming down.

“To the person that made that website, I’m going to fucking sue you. I promise you, with every part of my soul I’m going to fucking sue you. That’s all I have to say, I know I shouldn’t have gone live. But I couldn’t do it. I’m so exhausted and I think you guys need to know what pain looks like ‘cause this is it. This is what it looks like to feel violated. This is what it feels like to be taken advantage of, this is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.”

There was no catharsis. She ended the stream looking just as wounded as she did at the start. You could tell she hadn’t slept much before the stream, and wouldn’t sleep much after. It was brutal.

The video affected me deeply. I felt awful for her, and assumed everyone would. But when I logged onto Twitter later that day, it was clear this wasn’t the case. Someone had posted a screengrab from the same video I’d just seen with the caption “Millionaire internet streamer's reaction to AI porn of herself. You won't find more fragile people than popular internet personalities (especially women).” Bewildered, I quote tweeted it, saying — hyperbolically — that “if you can’t understand why someone would feel violated and upset by this you should be in jail.” The tweet went viral, which surprised me. I had been working on a piece exploring AI’s potential impact on the porn industry, as told by porn creators (that’s still forthcoming), but the sheer number of responses to my tweet created what amounts to a virtual focus group on deepfake porn, and the results are worth exploring.

Although the hyperrealistic AI porn that portrays people who don’t actually exist, like in these images, has dominated the discourse over the last few days, it rarely elicits the frank, emotional responses from regular people that deepfake porn does. It’s easy to joke about the mistakes a computer makes when it tries to create a woman from scratch: the imperfections, the extra fingers. But yesterday, people in my mentions responded quite differently when confronted with the possibility that the face on the fake porn could be theirs. They imagined it happening to them, and base human emotions came out: violent fantasies of revenge directed at whoever would create these kinds of deepfakes.

People insisted that something be done. People insisted nothing can be done. One sneered: it’s the internet, been here long? People called for laws. The French government banned the tweet I quote tweeted.

Some noted that people have been making fake porn of celebrities for a long time. Others pointed out how massively the technology has improved, and how realistic deepfakes can look today, especially to someone unfamiliar with the person in them. Some said others need to be able to accept the consequences of “living their lives in full publicity.”

The identitarian clown car arrived. The misogyny I quote tweeted provoked misandry in response. “Not all men. But… somehow always a man.” Ethnonarcissism arrived at nine in the morning in the form of a Persian man feeling sorry for himself and mocking Westerners for finding “a brand-new, high-level complexified way of ‘feeling violated.’”

Multiple people (to the misandrists’ credit, virtually all of them men) were bewildered that anyone would actually be bothered to find deepfake porn of themselves online. Basically, some version of: if it’s not real, and I know it’s not real, why would it matter? Some said they’d be flattered; I believe them. A pattern emerged of men so deprived of sexual attention that they actually did seem to see QTCinderella’s plight as enviable in some way.

AI is making it clear that pure, unqualified techno-optimism is, at least in part, the domain of sociopaths, incels and autistics. From what I’ve seen here, they’re the only ones who don’t seem to have any reservations about where things are headed. Everyone else is scared, angry, or despondent. They think something must be done, or that nothing can be done.

Perhaps something could be done, but nothing has been so far. There’s a federal revenge porn law that allows victims of nonconsensual porn to file lawsuits against perpetrators, but it doesn’t address deepfakes specifically. A federal law that does should be in place. Will it stop deepfake porn? Not completely. Federal law hasn’t eliminated the production and distribution of child pornography either, but the enforcement of those laws has driven the practice to the extreme margins and attached a heavy cost to participating in the trade. Such a law could make deepfakes rare and limit their distribution, in turn limiting the harm inflicted on victims. People would be wise to demand one.

Until then, things will only get worse.

-River Page
