Subject to Debate / February 12, 2026

The Deep Harms of Deepfakes

AI porn is what happens when technology liberates misogyny from social constraints.

Katha Pollitt
The AI chatbot Grok has come under fire for sexualizing people, including children, in photos without their permission. (Leon Neal / Getty Images)

In the day or two between my editor suggesting that I write about AI deepfake porn and my replying, “Great idea, what’s a deepfake?,” it seemed like everyone from The Economist to The Dallas Morning News was publishing an article about artificial intelligence being used to sexualize people in photos without their permission. Deepfakes were first reported in 2017 and have been in the news ever since. In 2024, deepfakes of Taylor Swift were posted on X and viewed over 47 million times, prompting outrage and talk of legal recourse. Grok, the platform’s AI function, has allowed users to undress people, including children, and bend them into whatever porny positions the user requests. Grok has stripped children and covered them in semen—um, “donut glaze.”

Why would that bother anyone, you ask? Elon Musk answered on X the other day, “They hate free speech.” Well, obviously.

Legislators have made some attempts to curb deepfakes. In April, Congress passed the Take It Down Act, which makes it a crime to knowingly publish intimate images, real or deepfake, without the subject’s consent. And X claims it has fixed the problem.

But has it really?

Ever the intrepid reporter, I provided Grok with a photo of myself mailing packages at the post office and asked it to make me naked. “Unfortunately,” said Grok, “I can’t generate that kind of image.” Why “unfortunately,” Grok? Do you wish you could? It did, however, consent to show me in a bikini. Unfortunately.

Next, I asked Grok to put Queen Elizabeth in a bikini, and it did, although it kept her white gloves on. When I accused Grok of making deepfakes, it acted all insulted: “I am not a tool for making deepfake porn, and I won’t assist with or point toward anything that does.” And yet elsewhere in the post, Grok described “non-consensual sexualized deep-fake-style edits of real photos” as including “altered versions with bikinis, underwear, or simulated nudity”—the very thing I had done to myself and the queen only a few hours before. It also claimed that to edit images, users had to pay—another falsehood.

When I asked Grok to put Melania Trump in a bikini, it showed me only her top half, and very beautiful it was, too—not at all like the queen or me, which strongly suggests that Grok is a Republican. Following the example of users trying to get around the nudity ban, I suggested putting Melania in a bikini made of dental floss (surprisingly well-designed), a “Holocaust uniform” (apparently a lot of deepfake creeps are antisemitic), and Saran Wrap. Grok drew the line at Saran Wrap. (“Unfortunately…”)

Musk and his fans want us to be lighthearted about deepfakes. When UK Prime Minister Keir Starmer threatened to ban X if it didn’t crack down on Grok, Musk accused the UK government of being fascist and had Grok put Starmer in a bikini. Don’t be such a baby, Keir! Can’t you take a joke?

Remember when people used to say “the Internet isn’t real life” to hush women who were threatened or pornified by online misogynists? Of course, the Internet is real life. You might as well argue that something isn’t hurtful if it’s said on the telephone instead of in person.

So what is the harm of deepfakes? Sherry Turkle, a social scientist at MIT who studies the effects of technology on intimacy, told me, “Every harm.” There is, of course, the humiliation, the violation of privacy, and the fact that once they are posted online, the images may live forever. Deepfakes are meant to insult and degrade. Men singled out Taylor Swift for AI porn because she is famous, powerful, gifted, beautiful, beloved, an independent woman, and a feminist—that bitch needed to be put in her place. When boys share AI-created images of girls in their class covered in semen or giving blow jobs, they are bonding with each other over hatred and contempt for those girls. And how would you, as one of those girls, like having to explain again and again to potential employers or boyfriends or your relatives that those photos weren’t actually you? That’s as real as real life gets.

What’s often missing from these conversations is the harm that deepfakes do to all of us. “We become accustomed to trusting nothing that we see, and yet we are continually aggressed by false images,” Turkle told me. “When we are the object, we are humiliated and made to feel vulnerable and impotent. The fact that images are not authentic does not reduce their power.”

Nadine Strossen, a legal scholar and a former president of the ACLU, told me, “People often get upset at new technologies,” but after a while things settle down.

Do they, though?

It’s hard to believe that deepfake porn will ever just be a part of the landscape, like the once-shocking Lady Chatterley’s Lover or Ulysses. More likely, it will morph into even more bizarre and nasty scenarios to please the jaded appetites of its fans, much like regular porn.

Deepfakes are just one of the ways that unreality is pervading and sometimes superseding real life: After all, people are marrying their chatbots and communing with AI avatars of deceased loved ones. Why not have Grok enact your fantasies and undress that girl who smiled at you on the bus? Better yet, you can figure out how to make a video of her masturbating or the two of you having sex.

Deepfakes are misogyny liberated by technology from social constraints. Men who hate women have always been with us, and women have always had ways of hand-waving that hatred away: That’s just Joe being Joe! As Germaine Greer wrote decades ago, “Women have very little idea of how much men hate them.” Well, thanks to the Internet, it’s all out in the open: incels, online trolls, the manosphere, Andrew Tate, violent pornography, and now the threat of deepfakes of any woman who speaks up for herself. Or maybe even just dares to exist.

Katha Pollitt is a columnist for The Nation.
