Elon Musk's X is blocking searches for Taylor Swift.
Explicit AI-generated pictures of the singer went viral on the platform last week.
The images have drawn condemnation from figures including Microsoft CEO Satya Nadella.
When users search Swift's name, they get an error message saying: "Something went wrong. Try reloading."
X did not immediately respond to a request for comment from Business Insider, made outside normal working hours.
However, X's head of business operations, Joe Benarroch, told BBC News the action was taken "with an abundance of caution as we prioritize safety on this issue."
X appeared to acknowledge the spread of the fake images in a post on Friday, saying it was working to remove them and "closely monitoring" the situation.
"Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content," the company said on X's safety account.
Swift is reportedly furious about the images and considering legal action, per the Daily Mail. The pictures have also drawn international condemnation from public figures.
Microsoft CEO Satya Nadella described the images as "alarming and terrible" and called for more "guardrails" on the technology in an interview with NBC Nightly News due to be broadcast Tuesday. The White House also called the images "alarming."
White House press secretary Karine Jean-Pierre said: "We know that lax enforcement disproportionately impacts women, and they also impact girls, sadly, who are the overwhelming targets."
She added that while legislation should play a role in tackling the nefarious use of AI, social media platforms should also enforce bans on harmful AI-generated content.
"We believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual, intimate imagery of real people," Jean-Pierre said.