Taylor Swift Searches Blocked By X Following Viral Graphic AI-Generated Images

Deepfake images of Swift have been scrubbed from X/Twitter, and related search terms have been blocked.

X, the social media platform formerly known as Twitter, appeared to block all search results for Taylor Swift on Saturday in the wake of sexually explicit AI-generated images flooding the site earlier this week. According to The Hollywood Reporter, specific search inputs — particularly those that would be used to circulate the graphic AI images — brought up no results. Other variations of searches with Swift's name, including "Taylor Swift music" and "Taylor Swift singer," pulled up results about the artist and her work.

The social media site's apparent blocking of certain searches pertaining to Swift is the latest response to the pornographic deepfake images that proliferated on the site earlier this week. X, as well as other social media platforms, tried to remove the images after they went viral, though new AI deepfakes of Swift popped up afterward. As of this writing, X boss Elon Musk has made no comment about the situation. However, the White House weighed in on the issue on Friday.

"We are alarmed by the reports of the … circulation of images that you laid out — of false images to be more exact, and it is alarming … While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual intimate imagery of real people," Press Secretary Karine Jean-Pierre told ABC News about the situation.

SAG-AFTRA also issued a statement, calling for the "development and dissemination of fake images" without consent to be made illegal:

"The development and dissemination of fake images — especially those of a lewd nature — without someone's consent must be made illegal," SAG-AFTRA said in a statement. "As a society, we have it in our power to control these technologies, but we must act now before it is too late. SAG-AFTRA continues to support legislation by Congressman Joe Morelle, the Preventing Deepfakes of Intimate Images Act, to make sure we stop exploitation of this nature from happening again. We support Taylor, and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy."

X's Safety Team Has Issued a Statement About the Deepfakes

On Friday, X's safety team shared a statement indicating that they were removing the identified images as well as "taking appropriate actions against the accounts responsible for posting them." One such account, which had a post sharing the deepfakes that had been viewed more than 45 million times, was suspended on Thursday.

"Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content," the statement read. "We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We're committed to maintaining a safe and respectful environment for all users."
