
Press Releases

Dingell's TAKE IT DOWN Act Signed into Law

Congresswoman Debbie Dingell’s (MI-06) bipartisan TAKE IT DOWN Act, which protects victims of non-consensual intimate imagery, was signed into law today by President Donald Trump. The TAKE IT DOWN Act protects victims of real and deepfake ‘revenge pornography’ by criminalizing the publication of these harmful images and requiring websites to quickly remove them. The rising popularity of AI requires decisive federal legal protections that will empower victims of these heinous crimes, most of whom are women and girls.
 
The bill was also led by Reps. María Elvira Salazar (R-FL), Madeleine Dean (D-PA), Vern Buchanan (R-FL), August Pfluger (R-TX), and Stacey Plaskett (D-VI), and Senators Amy Klobuchar (D-MN) and Ted Cruz (R-TX).
 
“The increasing use of artificial intelligence to create and circulate deep fake pornography threatens the well-being and security of its victims, primarily women. Perpetrators have used deep fake pornography as a tool to harass, humiliate, and intimidate women and children online, and we need to work together to protect against these threats. This is a serious and growing issue that requires urgent action, which is why I introduced the TAKE IT DOWN Act,” said Rep. Dingell. “This law will provide a critical remedy for victims to ensure these images are removed and that perpetrators are held accountable. As new technology emerges, so too does the potential for new forms of abuse. I’m grateful for my partners in the House and Senate who helped get this bill across the finish line and passed into law, and I will continue to work with everyone, on both sides of the aisle, to respond to emerging technological threats, and protect victims of sexual violence.”
 
The TAKE IT DOWN Act solves the problem of inconsistent, or non-existent, legislation protecting victims of deepfake pornographic images at the state level. While nearly all states have laws protecting their citizens from revenge porn, only 20 states have explicit laws covering deepfake non-consensual intimate imagery (NCII). Among those states, there is a high degree of variance in classification of crime, penalty, and even criminal prosecution. Victims also struggle to have images depicting them removed from websites in a timely manner, potentially contributing to more spread and re-traumatization.
 
In 2022, Congress passed legislation creating a civil cause of action for victims to sue individuals responsible for publishing NCII. However, bringing a civil action can be incredibly impractical. It is time-consuming, expensive, and may force victims to relive trauma. Further exacerbating the problem, it is not always clear who is responsible for publishing the NCII.
 
The TAKE IT DOWN Act addresses these issues while protecting lawful speech by:

  • Criminalizing the publication of NCII or the threat to publish NCII in interstate commerce;
  • Protecting good faith efforts to assist victims by permitting the good faith disclosure of NCII for the purpose of law enforcement or medical treatment;
  • Requiring websites to take down NCII within 48 hours of receiving notice from the victim; and
  • Requiring that computer-generated NCII meet a ‘reasonable person’ test for appearing to realistically depict an individual, so as to conform to current First Amendment jurisprudence.