Understanding your situation
What you need to prepare
- ✓ Evidence of the deepfake: URLs, screenshots (with timestamps), downloaded copies if possible
- ✓ Proof that the content is fabricated: original unmanipulated versions, alibi evidence
- ✓ Platform where the content is published: name, URL, any response to takedown requests
- ✓ Identity of the perpetrator (if known): name, account, or any identifying information
- ✓ Timeline: when you first became aware of the deepfake
- ✓ Evidence of harm: emotional distress, financial loss, reputational damage, screenshots of distribution
- ✓ Your identification (to prove you are the person depicted)
- ✓ Police report number (if you have already reported to law enforcement)
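When you download copies, it helps to record a cryptographic hash and a UTC timestamp at the moment of capture, so you can later show the file has not been altered since you saved it. A minimal sketch in Python (the function name and record format are illustrative, not a legal or forensic standard):

```python
import hashlib
from datetime import datetime, timezone

def evidence_record(source_url: str, data: bytes) -> dict:
    """Build an integrity record for a downloaded copy: the source URL,
    a SHA-256 fingerprint of the exact bytes saved, and the UTC time of
    capture. Keep this record alongside the saved file itself."""
    return {
        "source_url": source_url,
        "sha256": hashlib.sha256(data).hexdigest(),
        "captured_at_utc": datetime.now(timezone.utc).isoformat(),
    }

# Example with in-memory bytes; in practice, pass the bytes of the
# file you downloaded (the URL here is hypothetical).
record = evidence_record("https://example.org/post/123", b"example bytes")
```

Because the hash covers the exact bytes saved, any later edit to the file changes the fingerprint, which is what makes the record useful as supporting evidence.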
⏰ Deadline
Act immediately. For platform takedowns, most platforms offer expedited review for non-consensual intimate imagery and deepfakes. For criminal complaints, statutes of limitation vary but are typically measured in years. For GDPR requests, controllers must respond within one month of receipt (Article 12(3)), so send your erasure request promptly and keep proof of when you sent it.
🏛️ Authority
- Law enforcement: national police (criminal complaint for non-consensual imagery, fraud, harassment, or defamation)
- Platform: report the content using the platform's own reporting tools
- National DPA: for GDPR violations (processing biometric data without consent, Article 9)
- National AI competent authority: for AI Act transparency violations (from August 2026)
- Cybercrime units: Europol EC3 and national cybercrime divisions
⚖️ Legal basis
- EU AI Act Article 50(4): from 2 August 2026, deployers of deepfake AI systems must disclose that content has been artificially generated or manipulated
- EU AI Act Article 99: penalties for transparency violations will include fines of up to EUR 7.5 million or 1.5% of global turnover
- GDPR Article 9: processing of biometric data requires explicit consent
- GDPR Articles 17 and 79: right to erasure and right to an effective judicial remedy
- Digital Services Act (DSA): platform obligations to act against illegal content
- Italy Law 132/2025: specific criminal offense for unlawful deepfake dissemination (1-5 years imprisonment)
- National laws on defamation, harassment, non-consensual intimate imagery, and identity fraud
Expert tips
1. Preserve all evidence immediately. Take screenshots with timestamps, download content, and archive URLs using the Wayback Machine. Digital evidence often disappears quickly once reported.
2. Report to the hosting platform first: most have expedited processes for deepfake takedowns, especially for non-consensual intimate imagery.
3. File a police report, especially if the deepfake involves intimate imagery, fraud, threats, or extortion. In many jurisdictions, creating or distributing certain deepfakes is already a criminal offense.
4. Send a GDPR Article 17 erasure request to any organization hosting the content. Processing your biometric data without consent violates GDPR Article 9.
5. From 2 August 2026, the AI Act Article 50 transparency obligations become enforceable. Deployers who fail to disclose AI-generated content will face fines of up to EUR 7.5 million or 1.5% of turnover.
6. If you are a victim of deepfake sextortion, do not pay. Report to law enforcement and the platform immediately.
7. Consider contacting the Revenge Porn Helpline (UK) or an equivalent national helpline for support.
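The URL archiving in tip 1 can be scripted. The Internet Archive accepts capture requests at `https://web.archive.org/save/<url>` ("Save Page Now"); the sketch below assumes that endpoint's current behavior, so check web.archive.org for the authoritative API before relying on it:

```python
import urllib.request

SAVE_ENDPOINT = "https://web.archive.org/save/"  # Internet Archive "Save Page Now"

def save_request_url(target: str) -> str:
    """Build the capture-request URL for a page you want archived."""
    return SAVE_ENDPOINT + target

def archive_page(target: str, timeout: int = 60) -> str:
    """Ask the Wayback Machine to capture `target` and return the URL the
    request resolved to (normally the archived snapshot). Network access
    and a successful capture are assumed; errors raise urllib exceptions."""
    req = urllib.request.Request(
        save_request_url(target),
        headers={"User-Agent": "evidence-archiver/0.1"},  # identify your client
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.geturl()
```

Where possible, archive the same URL with more than one service: a snapshot held by an independent third party is harder to dispute than copies only you control.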
