Taylor Swift’s name is currently unsearchable on X, days after deepfake pornographic and violent images of the pop star went viral on the site.
As of Saturday afternoon, users attempting to search her name were met with the message “Something went wrong. Try reloading.”
So far, there’s been no word from X on the new development, but the company did address the deepfakes in a statement on Friday.
“Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content. Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them,” the company said.
The sexually explicit images using Swift’s likeness were allegedly created using artificial intelligence and garnered more than 27 million views in just 19 hours before X suspended one account that posted them, NBC News reports.
Many of the images contained a watermark suggesting they were connected to a website known for making fake nude photos of celebrities, which even has a section titled “AI Deepfake.”
In an effort to drown out the explicit photos, Swift’s supporters responded by sharing positive images of the songstress with the hashtag #ProtectTaylorSwift.
Some commended X’s decision to make her name unsearchable on Saturday, with one user calling it the “first step done to safeguard her.”
The White House also took notice of the incident, with Press Secretary Karine Jean-Pierre telling ABC News on Friday that they are “alarmed” by what happened to Swift online and that Congress “should take legislative action.”