After explicit AI-generated images of Taylor Swift circulated online, US lawmakers are urging strict legal measures to stop the production and distribution of such content.
The fake images, which used AI techniques to alter Swift's appearance, have sparked widespread criticism and calls for regulation.
US Congressman Joe Morelle has strongly condemned the incident, describing the spread of these images as "appalling."
The social media site X has responded by removing the images and blocking search terms associated with the fakes, including "Taylor Swift AI" and "Taylor AI." Despite these efforts, one image was viewed 47 million times before it was removed.
The rapid growth in the production of fake material, which a recent study found has increased by 550% since 2019, underscores the scale of the problem.
In the US, there are no federal laws specifically addressing deepfake content, though some states have begun to act. The UK, in contrast, has made sharing deepfake pornography a crime under the Online Safety Act 2023.
Representative Morelle, who previously introduced legislation titled the Preventing Deepfakes of Intimate Images Act, highlighted the devastating emotional, reputational, and financial damage that deepfakes cause, especially for women, who are disproportionately affected.
Indeed, The State of Deepfakes report from the previous year found that the vast majority of people targeted by deepfake pornography are women.