The term ‘Taylor Swift AI’ was trending in various regions on X (formerly Twitter), with one post gaining more than 45 million views before it was eventually removed
Pic courtesy: Taylor Swift's Instagram account
The Elon Musk-owned social media platform has temporarily blocked searches for 'Taylor Swift' after disturbing deepfakes of the singer surfaced on X earlier this week, according to The Hollywood Reporter.
While the social media platform returned no results for the search terms "Taylor Swift nude" or "Taylor Swift AI," benign-sounding searches, such as "Taylor Swift singer," still returned results.
The move follows the White House weighing in on the controversy on Friday, urging Congress to "take legislative action."
Press Secretary Karine Jean-Pierre told ABC News, "We are alarmed by the reports of the circulation of images that you just laid out -- of false images to be more exact, and it is alarming ... While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people."
New York Congressman Joe Morelle is citing the deepfake nudes of the 34-year-old singer to push for legislation making the nonconsensual sharing of digitally altered sexual photographs a federal crime, as per The Hollywood Reporter.
X and other social media platforms attempted to remove the Swift photographs after they went viral on Wednesday, but fresh Swift-inspired imposter images that differed from the originals began circulating in their place, potentially complicating enforcement.
The original photos showed Swift in red body paint sexually cavorting with Kansas City Chiefs fans, mocking her relationship with Chiefs tight end Travis Kelce. The Chiefs face the Baltimore Ravens in a vital AFC Championship game on Sunday, which will determine whether the club advances to the Super Bowl.
SAG-AFTRA released a statement on Friday, stating that the photos "are upsetting, harmful, and deeply concerning."
"The development and dissemination of fake images -- especially those of a lewd nature -- without someone's consent must be made illegal," the union continued. "As a society, we have it in our power to control these technologies, but we must act now before it is too late."
"SAG-AFTRA continues to support legislation by Congressman Joe Morelle, the Preventing Deepfakes of Intimate Images Act, to make sure we stop exploitation of this nature from happening again. We support Taylor, and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy."
This story has been sourced from a third-party syndicated feed and agencies. Mid-day accepts no responsibility or liability for the dependability, trustworthiness, or reliability of the text. Mid-day management/mid-day.com reserves the sole right to alter, delete, or remove the content (without notice) at its absolute discretion for any reason whatsoever.