The social media platform X — formerly Twitter — is blocking some searches of Taylor Swift after fake pornographic images of the pop star started circulating online.
Searches of the singer's name on the platform generated an error message Monday asking users to retry their search. Putting quotation marks around her name, however, allowed some posts to appear, though not the lewd images that have stirred controversy.
The sexually explicit photos of Swift, which were reportedly generated using artificial intelligence, show her in various vulgar positions at what appears to be a Kansas City Chiefs game, a reference to her relationship with the team's tight end Travis Kelce.
The nonconsensual photos garnered millions of views just hours after they were posted, before the account that originally shared them was reportedly suspended. However, the images continued to circulate on other accounts.
After the photos started making the rounds on social media, some of the pop star's fans joined an online campaign to "protect Taylor Swift." They have been busy reporting the explicit content and posting actual pictures of the pop star in hopes of making the fake images more difficult to find.
Without naming Swift, X appeared to respond to the campaign to get the images taken down.
"Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content," a statement from the X Safety account said. "Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed."
With AI technology now widely accessible, this surely won't be the last time that explicit deepfake images circulate online. However, steps are being taken to mitigate the harmful impact they can have.
President Joe Biden in October signed an executive order to regulate AI and manage its risks, among them the use of nonconsensual intimate imagery of real individuals. Reps. Joseph Morelle, D-N.Y., and Tom Kean, R-N.J., also reintroduced a bill last week called the Preventing Deepfakes of Intimate Images Act that would make sharing such fake explicit images a federal crime.
While the future of the bill remains unclear, lawmakers have said they plan to tackle AI in bite-size pieces this year.