Taylor Swift AI Nudes Provoke Fandom Uproar on X: “Disgusting as Hell”

Pornographic, AI-generated nude images of Taylor Swift have been circulating on social media, provoking massive outrage among her fans and beyond.

The fake images — deepfakes — portrayed Swift in several sexualized positions while partially clad in Kansas City Chiefs garb among hordes of fellow Chiefs fans (a reference to her romance with Chiefs tight end Travis Kelce).


According to The Daily Dot, the images were seen at least 22 million times before X cracked down on the posts. Swift fans rallied to report the images and also flooded the social media site with positive posts about the singer that contained keywords such as “Taylor Swift AI” to try and drown out the lurid fakes, causing the topic to trend Thursday morning. The images were also reportedly found on Facebook, Instagram and Reddit, which have similarly been trying to clamp down on their spread.

One account in particular has been cited as causing the images to go viral, though it’s unclear if the user also created the images.

“I’ve repeatedly warned that AI could be used to generate non-consensual intimate imagery,” wrote Democratic Virginia Senator Mark Warner on X. “This is a deplorable situation, and I’m going to keep pushing on AI companies to stop this horrible capability and on platforms to stop their circulation.”

The images made the rounds just as an alleged Swift stalker was arrested three times within three days around the singer’s New York City home.

On Friday, SAG-AFTRA issued a statement on the images, writing that they “are upsetting, harmful, and deeply concerning.”

“The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal,” the union continued. “As a society, we have it in our power to control these technologies, but we must act now before it is too late. SAG-AFTRA continues to support legislation by Congressman Joe Morelle, the Preventing Deepfakes of Intimate Images Act, to make sure we stop exploitation of this nature from happening again. We support Taylor, and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy.”

Around the same time, White House Press Secretary Karine Jean-Pierre said during a news briefing that the images of Swift were “very alarming.”

“Too often, we know that lax enforcement disproportionately impacts women,” Jean-Pierre continued.

“While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual, intimate imagery of real people,” Jean-Pierre said, citing the president’s implementation of “the first national 24/7 helpline for survivors of image-based sexual abuse,” which launched this fall.

Some of the reactions among Swift fans:

“the taylor swift ai images are insane. actually terrifying that they exist. please report + don’t give more attention to those tweets. some of these men really need to be locked in a cage and shipped off to mars or smth” wrote one user.

“Protect her, don’t missuse tech. Taylor Swift AI is as disgusting as hell Please PROTECT TAYLOR SWIFT,” wrote another.

“I hope there’s a place in hell specially reserved for people who thinks making explicit photos of women is ok and acceptable. I hope that hell is scary and torturous,” wrote another.

“The amount of bad arguments I’ve seen in favor of shit like the Taylor Swift Ai pics is baffling. “You only care cus it’s Taylor!” No actually it’s bad when it happens to anyone. It was bad when mfs did it to Tom Holland too I didn’t forget that shit. But go on keep deflecting,” wrote another.

“taylor swift fans are genuinely amazing. AI porn of her goes viral and they mobilize with over 200k posts to protect Taylor, calling for action, and getting the accounts that distributed the porn suspended. they literally accomplish stuff our legal system can’t,” wrote another.

So far, Twitter/X boss Elon Musk has been silent on the issue, and Swift hasn’t issued a comment.

In October, President Joe Biden issued an executive order to try to prevent “generative AI from producing child sexual abuse material or producing non-consensual intimate imagery of real individuals.” Nonconsensual deepfake pornography is also illegal in several states. In addition to sexualized images, deepfakes are increasingly being used by duplicitous businesses to depict celebrities endorsing products.

The deepfake issue is a particular concern heading into a contentious presidential election year. Recently, a robocall spoofing Biden’s voice was deployed to try to influence Tuesday’s New Hampshire primary.

Jan. 26, 4:20 p.m. Updated with statements from SAG-AFTRA and the White House.
