Searches for “Taylor Swift” on Saturday are drawing a blank space on Twitter/X, as the social network is no longer showing any search results for the singer after a proliferation of pornographic AI nudes flooded the service earlier this week.
The service also shows no results for “Taylor Swift nude” or “Taylor Swift AI” — the search terms widely used to circulate the illicit images. But benign-sounding searches, such as “Taylor Swift singer,” still pull up results.
The move follows the White House weighing in on the controversy on Friday, urging Congress to “take legislative action.” Press Secretary Karine Jean-Pierre told ABC News, “We are alarmed by the reports of the … circulation of images that you just laid out — of false images to be more exact, and it is alarming … While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people.”
Meanwhile, New York Congressman Joe Morelle is citing the Swift nudes in a push to pass a bill that would make nonconsensual sharing of digitally altered explicit images a federal crime.
X and other social media platforms attempted to scrub their platforms of the Swift images, which went viral on Wednesday, but some new Swift-inspired fake images that were different from the originals began to circulate in their place, possibly making enforcement more difficult.
The original images depicted Swift in red body paint sexually cavorting with Kansas City Chiefs fans, mocking her romance with Chiefs tight end Travis Kelce. On Sunday, the Chiefs play their pivotal AFC Championship game against the Baltimore Ravens, which will decide which team goes to the Super Bowl.
So far, Twitter/X boss Elon Musk has been uncharacteristically silent on the issue, instead commenting Saturday on matters such as a San Francisco toy store closing due to the city’s crime issues. Swift hasn’t issued a comment, either, though unverified reports have claimed the singer is considering legal action.
On Friday, SAG-AFTRA also issued a statement on the images, writing that they “are upsetting, harmful, and deeply concerning.”
“The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal,” the union continued. “As a society, we have it in our power to control these technologies, but we must act now before it is too late. SAG-AFTRA continues to support legislation by Congressman Joe Morelle, the Preventing Deepfakes of Intimate Images Act, to make sure we stop exploitation of this nature from happening again. We support Taylor, and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy.”
The deepfake issue is also a concern heading into a contentious presidential election year. Recently, a robocall spoofing Biden’s voice was deployed to try to influence Tuesday’s New Hampshire primary.