
Title: Teen victim of deepfake pornography urges Congress to pass ‘Take It Down Act’
A 15-year-old girl’s nightmare has become a call to action. Elliston Berry, whose personal Instagram photo was manipulated into explicit content without her consent, is now advocating for the “Take It Down Act” in Congress.
Berry shared her story with CBS News, revealing that she was devastated to discover classmates were sharing images of her on Snapchat after someone used an artificial intelligence program to remove her dress from an innocent picture. The incident occurred 15 months ago and remains a constant reminder of the destructive consequences of deepfake pornography.
Last year alone, more than 21,000 deepfake pornographic videos surfaced online, a staggering 460% increase over the previous year. Websites now use this technology to generate explicit content without the consent of those depicted, leaving victims like Berry feeling humiliated and helpless.
The disturbing trend has prompted lawmakers to take action. In an effort to curb the spread of deepfake pornography, Republican Sen. Ted Cruz of Texas and Democratic Sen. Amy Klobuchar of Minnesota have joined forces to propose the “Take It Down Act.” If passed, the legislation would require social media companies and websites to immediately remove non-consensual pornographic images created using AI.
“I can’t go back and redo what he did, but instead, I can prevent this from happening to other people,” Berry said, underscoring her determination to bring about change.
Source: http://www.cbsnews.com