
The FTC has put tech companies on notice regarding compliance with the Take It Down Act (TIDA), a new federal law that became effective May 19, 2025, requiring platforms to remove nonconsensual intimate images including AI-generated deepfakes. The law’s platform requirements become enforceable on May 19, 2026, giving covered platforms one year to establish compliant removal processes.
Platform Requirements
Covered platforms must establish a clear and conspicuous notice-and-removal process that allows victims to request removal of intimate photos or videos shared without their consent. Upon receiving a valid request, platforms must, within 48 hours, remove the content and make reasonable efforts to identify and remove any identical copies. The law applies to publicly available websites, apps, and platforms that host user-generated content or regularly make such imagery available.
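To make the operational shape of this requirement concrete, the sketch below models a minimal takedown workflow: each valid request is stamped with a 48-hour deadline, and content is fingerprinted so identical copies can be matched later. This is an illustrative simplification, not legal guidance; the names (`TakedownQueue`, `content_fingerprint`) are hypothetical, and a production system would also need requester identity verification, perceptual hashing to catch re-encoded copies, and audit logging.

```python
import hashlib
from datetime import datetime, timedelta, timezone

# Statutory deadline: removal no later than 48 hours after a valid request.
REMOVAL_WINDOW = timedelta(hours=48)


def content_fingerprint(data: bytes) -> str:
    # Exact-duplicate detection via cryptographic hash. Byte-identical copies
    # match; re-encoded or resized copies would need perceptual hashing.
    return hashlib.sha256(data).hexdigest()


class TakedownQueue:
    """Hypothetical tracker for pending removal requests."""

    def __init__(self) -> None:
        self.requests: list[tuple[str, datetime]] = []  # (fingerprint, deadline)

    def file_request(self, data: bytes, received_at: datetime) -> tuple[str, datetime]:
        # Record the request and compute the 48-hour compliance deadline.
        fp = content_fingerprint(data)
        deadline = received_at + REMOVAL_WINDOW
        self.requests.append((fp, deadline))
        return fp, deadline

    def matches(self, data: bytes) -> bool:
        # Check newly uploaded content against previously reported items,
        # supporting the "identify and remove identical copies" obligation.
        fp = content_fingerprint(data)
        return any(fp == reported for reported, _ in self.requests)


queue = TakedownQueue()
received = datetime(2026, 5, 19, tzinfo=timezone.utc)
fingerprint, deadline = queue.file_request(b"reported-image-bytes", received)
```

A real deployment would persist the queue, alert moderators as deadlines approach, and run the duplicate check at upload time as well as retroactively across stored content.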
Scope of Coverage
TIDA covers both authentic intimate visual depictions and “digital forgeries” created using software, apps, or artificial intelligence. The law prohibits using interactive computer services to knowingly publish intimate images of identifiable individuals without consent, whether the images are real or AI-generated. Notably, prior consent to create an image or share it with another person does not constitute consent for its publication.
FTC Enforcement Authority
The FTC enforces Section 3 of TIDA, which establishes the platform removal requirements. Noncompliance is treated as an unfair or deceptive act under Section 5 of the FTC Act, exposing platforms to civil penalties, injunctive relief, mandated changes, and ongoing compliance monitoring. Significantly, the law extends FTC enforcement authority to nonprofit organizations, which are typically outside the FTC’s jurisdiction.
Implementation Challenges
The 48-hour removal requirement and obligation to remove copies of reported images may demand significant operational resources and new technological infrastructure, particularly for platforms with limited moderation capabilities. End-to-end encrypted services face uncertainty about the scope of their obligations since Congress did not explicitly exclude them from coverage or limit the Act’s applicability to shared or publicly communicated content.




