- cross-posted to:
- stablediffusion@lemmit.online
- linux_lugcast@lemux.minnix.dev
A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.
So are you suggesting they can get an unaltered facial I.D. of the kids in the images? —Because that makes it regular csam with a specific victim (as mentioned), not an ai generative illustration.
No, I am telling you csam images can’t be generated by an algorithm that hasn’t trained on csam
That’s patently false.
I’m not going to continue to entertain this discussion. Instead, I’ll direct you to the multiple other people who have already effectively disproven this argument and similar arguments elsewhere in this post’s discussion. Enjoy.
Also, if you’d like to see why the corn dog comment is absurd and wrong, go look up my comment.
Sure thing bud. Sure thing 🙄