Image-based sexual abuse
What it is
Sharing private sexual photos, videos, or information without permission, whether the images are real or AI-generated (deepfakes). How they were obtained doesn't matter, and whether they're real doesn't matter: the sharing itself is the harm. In many jurisdictions it is now a crime.
How it gets justified
“It's not that bad — everyone has photos like that.”
That intimate photos exist isn't permission to share them. What's at work is the destruction of privacy as a humiliation tool. The body reads this as a violation because it is one.
“You sent them to me, so I can do whatever I want with them.”
Consent to send isn't consent to share. This turns trust into a retroactive weapon: from here on, you know any vulnerability can be used against you.