- cross-posted to:
- stablediffusion@lemmit.online
- linux_lugcast@lemux.minnix.dev
A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage: law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.
If no children were involved in the production of porn, how is it pedophilic? That’s like claiming a picture of water has the same properties as water.
However, a picture of water makes me thirsty. But then again, there is no substitute for water.
I am not defending pedos, or defending Florida for doing something like that.
That might be a you thing. Pictures of water don't make me thirsty. I get the metaphor you are attempting to make, though.
It’s pedophilic because it’s sexual images of children; fake or not does not change that. Drawn images of child pornography are still pedophilic.
The more important question is: is it CSAM? Whether drawn images that depict no real child count as CSAM depends on the legal jurisdiction. Should drawn and AI-generated child porn be legal or banned? Answering that properly would require significant research into whether its existence acts as an outlet that prevents pedophiles from harming actual children, or whether it encourages their proclivities and makes them more likely to hurt actual children.
Preventing harm to children should be the goal, but actual research into the effects of simulated child porn vis-à-vis pedophiles harming real children is as unlikely to happen as any other research into pedophilia.