Is someone checking your browsing?
This website will appear in your browser history. If you're concerned someone may be monitoring your internet use, consider using a trusted friend's device, a library computer, or your browser's private/incognito mode. You can press Quick Exit or hit Escape at any time to leave this site quickly.
Learn more about staying safe online
Someone has created fake intimate images of you using AI
Using artificial intelligence, someone has taken your ordinary photos — from social media, work, or everyday life — and generated fake nude or intimate images of you. You never took these photos. They were created without your knowledge or consent.
What You Might Notice
Intimate images of you appear that you never took
If intimate images of you exist that you never created or posed for, they may have been AI-generated from your public photos.
The images look almost real but something is slightly off
AI-generated images may contain subtle artefacts such as unusual skin texture, inconsistent lighting, or odd background details.
What You Can Do
Report to the eSafety Commissioner immediately
The eSafety Commissioner can issue removal notices to platforms hosting the images. esafety.gov.au
Report to police
Creating or distributing intimate images of someone without their consent is a criminal offence in Australia, whether or not the images are 'real'.
Don't try to find or view the images yourself
Searching for them can be traumatising and may increase their visibility in search results.
Important: This resource provides general information, not personal advice. Every situation is different. The actions suggested here may not be safe in your specific circumstances — particularly if the person causing harm could notice changes to your devices or accounts. Always consider your physical safety first.
If you need personalised support, contact 1800RESPECT (1800 737 732) or your local specialist domestic violence service. If you are in immediate danger, call 000.
This technique takes non-intimate photos (social media photos, selfies, professional headshots) and uses AI to generate nude or intimate versions. The victim never took or consented to an intimate image; the intimate image was fabricated from a clothed photo. This creates a growing legal gap, as some jurisdictions' laws require an 'original intimate image' to have existed.
Mitigations for this technique are under development. If you have suggestions on how to improve this content, please submit a pattern.
The TFA Matrix is a research framework under active development. Technique classifications, detection methods, and mitigations reflect current understanding and are subject to revision. This framework does not constitute forensic methodology, legal evidence standards, or clinical diagnostic criteria. Practitioners should apply professional judgement appropriate to their discipline and jurisdiction.