A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and growing prevalence of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic video footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • finley@lemm.ee

    It’s hard to believe someone is not a pedo when they advocate so strongly for child porn

    • ObjectivityIncarnate@lemmy.world

      You’re just projecting your unwillingness to ever take a stance that doesn’t personally benefit you.

      Some people can think about things objectively and draw a conclusion that makes sense to them without personal benefit being a primary determinant of said conclusion.

      • finley@lemm.ee

        > You’re just projecting your unwillingness to ever take a stance that doesn’t personally benefit you.

        I’m not the one here defending child porn

        • ObjectivityIncarnate@lemmy.world

          You’re arguing against a victimless outlet that there is significant evidence would reduce the incidence of actual child molestation.

          So let’s use your ‘logic’/argumentation: why are you against reducing child molestation? Why are you against fake pictures but not actual child molestation? Why do you want children to be molested?

          • finley@lemm.ee

            Your claim that it’s victimless is, of course, false since real children are used in the training data without consent. This also ignores the fact that the result is child porn, which you are arguing in support of.

            Lastly, your claim that any of this results in any reduction in child abuse is spurious and unsubstantiated.

            • ObjectivityIncarnate@lemmy.world

              > Your claim that it’s victimless is, of course, false since real children are used in the training data without consent.

              Your assumption, but there are a ton of royalty-free images that contain children out there, more than enough for an AI to ‘learn’ proportions, etc. Combine that with adult nudity, and a generative AI can ‘bridge the gap’ and create images of people that don’t exist (hence the word “generative”).

              > This also ignores the fact that the result is child porn

              That’s not a fact. “Child porn” requires a child. Pixels on a screen depicting the likeness of a person, a person that does not actually exist in the real world to boot, are not a child.

              > Lastly, your claim that any of this results in any reduction in child abuse is spurious and unsubstantiated.

              I’m just making a reasonable guess based on what’s been found about other things in the same subcategory (Japanese research found that those who have actually molested a kid were less likely to have consumed porn comics depicting that subject matter than the general population), and in other sex categories, like how the prevalence of rape-fantasy porn online correlates with a massive reduction in real-life rape.

              Seems pretty unlikely that this is going to be the one and only exception to date where a fictional facsimile doesn’t ‘satiate’ the urge to offend in real life, and instead encourages the ‘consumer’ to offend.

              • finley@lemm.ee

                That’s just defending child porn with extra steps.

                Why do you keep defending child porn? And why would you be so willing to die on the hill of defending child porn unless you, yourself, were a consumer of it? Yet, it was you who accused me of projection…

                It’s the shamelessness of you and your position that I find most revolting, really.

                • gamermanh@lemmy.dbzer0.com

                  Not the guy you’re replying to, but maybe this’ll help you understand:

                  If AI art isn’t art, AI CSAM isn’t CSAM

                  • finley@lemm.ee

                    > If AI art isn’t art

                    I’m not making that argument, but thanks for the straw man

                    Also, FWIW, for this argument to make sense, CSAM would have to be considered art, which it isn’t.

                    It amazes me the lengths that some people will go to in order to rationalize child porn.