Image-based sexual abuse

What it is

Sharing private sexual photos, videos, or information without permission — including images that are real or generated with AI (deepfakes). It doesn't matter how they were obtained, and it doesn't matter whether they're real: sharing them causes harm. In many jurisdictions it's now a crime.

Does this sound familiar?

Sending intimate photos to friends "as a joke".
Posting private content after a breakup.
Generating fake images with AI using your face.
"You sent them to me, I do what I want with them."

How it gets justified

It's not that bad — everyone has photos like that.

The existence of intimate photos is not permission to share them. What's at work is the destruction of privacy as a tool of humiliation. The body reads this as a violation because it is one.

You sent them to me, so I can do whatever I want with them.

Consent to send isn't consent to share. This converts trust into a retroactive weapon. From that point on, you know any vulnerability can be turned against you.

Related patterns

Something feels off but you can't name it?

An exercise to listen to what the body already knows.