Dive Brief:
- President Donald Trump is expected to sign the Take It Down Act, a bipartisan, bicameral bill criminalizing the publication of intimate images, including AI-generated deepfake nude images, without the depicted person’s consent. The bill, which passed in the House on Monday, was touted by Trump in his March address to Congress.
- The bill also states that any person who publishes a deepfake image of a minor would be subject to fines, imprisonment for up to three years or both. Social media and related websites would be required to remove such content within 48 hours of a victim’s notice.
- The issue of deepfake pornography began to plague schools after generative AI tools made it easy for students to generate harmful images of their peers. In fact, advocates of the Take It Down Act include a high school student whose classmate targeted her with deepfake pornographic imagery.
Dive Insight:
The Take It Down Act’s passage comes as more states push their own legislative efforts to combat illicit deepfake images.
However, state laws targeting deepfake pornography vary widely and have resulted in “uneven criminal prosecution,” the U.S. Senate Committee on Commerce, Science & Transportation said in a Monday statement.
In April, for instance, New Jersey Gov. Phil Murphy signed civil and criminal penalties into law for the creation and dissemination of “deceptive audio or visual media” or deepfakes. Under that law, creating or sharing deepfake images and audio is a third-degree crime that could lead to imprisonment and a maximum $30,000 fine. That law was inspired by a New Jersey high school student, Francesca Mani, who was targeted by deepfakes as a teenager.
A California law enacted in September 2024, meanwhile, deems the distribution of deepfake images depicting an “intimate body part” as a misdemeanor subject to maximum imprisonment of one year and a $2,000 fine.
Indeed, the threat of deepfake images in schools has been anxiety-inducing for some students — particularly girls, who are more likely to be harassed with fake nude images.
A September 2024 report by the Center for Democracy and Technology revealed that schools have primarily focused on harsh penalties for students who share sexually explicit deepfake images online. During the 2023-24 school year, 71% of teachers reported that when their schools caught students spreading deepfake “non-consensual intimate imagery,” punishments ranged from suspension and expulsion to referral to law enforcement.
But teachers indicated to CDT that schools still lacked supports for students victimized by deepfakes. Just 5% of respondents said their school provided resources to victims to help remove the images from social media or other online platforms.
The passage of the bipartisan Take It Down Act is a “historic win” for victims of deepfake abuse, said the bill’s author, Sen. Ted Cruz, R-Texas, in a Monday statement. “By requiring social media companies to take down this abusive content quickly, we are sparing victims from repeated trauma and holding predators accountable.”
Sen. Amy Klobuchar, D-Minn., echoed in a Monday statement that deepfake images can “ruin lives and reputations.”
“We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse,” said Klobuchar, who co-sponsored the Senate bill.
The bill passed the House on Monday in a sweeping 409-2 vote following its Senate passage in February.
Still, the legislation gives pause to Becca Branum, deputy director of CDT’s Free Expression Project.
“The TAKE IT DOWN Act is a missed opportunity for Congress to meaningfully help victims of nonconsensual intimate imagery,” Branum said Tuesday in an emailed statement. “The best of intentions can’t make up for the bill’s dangerous implications for constitutional speech and privacy online.”
Once the bill is signed into law, the Federal Trade Commission would be tasked with its enforcement. Trump recently fired two of the FTC’s Democratic commissioners, who have since sued the administration over the matter.
“Empowering a partisan FTC to enforce ambiguous legislation is a recipe for weaponized enforcement that risks durable progress in the fight against image-based sexual abuse,” Branum said.