On May 19, 2025, President Donald Trump signed the bipartisan “Take It Down Act” into law, establishing a federal framework to combat the distribution of non-consensual intimate imagery, including AI-generated deepfakes. This legislation criminalizes the act of knowingly publishing or threatening to publish explicit images without the subject’s consent, marking a significant step in addressing digital exploitation.
Introduced by Senator Ted Cruz (R-TX) and co-sponsored by Senator Amy Klobuchar (D-MN), the bill garnered overwhelming bipartisan support, passing the House by a 409–2 vote and clearing the Senate by unanimous consent.
First Lady Melania Trump also championed the Take It Down Act, aligning it with her long-standing “Be Best” initiative focused on children’s well-being and online safety. At the bill’s signing ceremony, she delivered remarks condemning the non-consensual sharing of intimate images as “malicious” and added her signature alongside the President’s.
Key provisions of the bill:
- The law makes it a federal offense to “knowingly” share or threaten to share intimate images without consent, encompassing both real and AI-generated content.
- Online platforms are mandated to remove reported non-consensual intimate images within 48 hours of a victim’s request and take steps to eliminate duplicates.
- The Federal Trade Commission (FTC) is empowered to enforce platform compliance with the takedown requirements, with violators facing fines up to $50,000 per incident; individuals convicted under the criminal provisions face imprisonment of up to three years.
The legislation responds to growing concerns over the misuse of AI technologies to create realistic, non-consensual explicit content. In late 2023, at Westfield High School in New Jersey, more than two dozen students were horrified to learn AI-generated pornographic images of them were circulating online. Alarm grew in January 2024, when sexually explicit AI-generated images of superstar Taylor Swift spread rapidly on social media.
The criminal provisions of the Take It Down Act took immediate effect upon signing. Online platforms are granted a one-year period, until May 19, 2026, to establish compliant reporting and removal procedures, ensuring victims have accessible means to request the takedown of non-consensual content.
The act promises victims a streamlined process to reclaim their agency. But it also places the burden of action on the very people already traumatized by abuse. Victims must identify the content, submit personal information, and prove lack of consent, often while navigating psychological trauma or threats from perpetrators.
On top of that, the bill only applies to public-facing platforms, leaving out private, encrypted, and decentralized networks where abuse often begins and flourishes. For survivors, this means the worst material may remain beyond reach.
The Take It Down Act is a great first step.
But while it establishes a long-overdue framework, it remains a reactive solution to a rapidly evolving crisis. True protection will require a shift in how our laws, platforms, and institutions prioritize the dignity, safety, and autonomy of those most at risk. Until then, survivors will continue to shoulder the burden of seeking justice in a system still catching up.