- cross-posted to:
- stablediffusion@lemmit.online
- linux_lugcast@lemux.minnix.dev
A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.
That’s not really a nuanced take on what is going on. A bunch of images of children are studied so that the AI can learn how to draw children in general. The more children in the dataset, the less any one of them influences or resembles the output.
Ironically, you might have to train an AI specifically on CSAM in order for it to identify the kinds of images it should not produce.
Why does it need to be “nuanced” to be valid or correct?
Because the world we live in is complex, and rejecting complexity for a simple view of the world is dangerous.
See You Can’t Get Snakes from Chicken Eggs from the Alt-Right Playbook.
(Note I’m not accusing you of being alt-right. I’m saying we cannot ignore nuance in the world because the world is nuanced.)
We’re not talking about snakes or chicken eggs, but thanks for the strawman.
Please recuse yourself from further interaction with anyone.
For your review
No, thanks, I’m not into kiddy porn
Well, it doesn’t, but it’s not correct.