They frame it as though it’s for user content, though more likely it’s to train AI; in fact it gives them the right to do almost anything they want, up to (but not including) stealing the content outright.

  • mods_are_assholes@lemmy.world · 9 months ago

    Most of those laws are unenforceable and some are even undetectable.

    Your ideology is getting in the way of objective fact.

      • mods_are_assholes@lemmy.world · 9 months ago

        The only way to make a clear-text LLM would be to devote most of the hard storage humanity produces over the next ten years to holding it, and we’d need about a quarter of Bitcoin mining’s processing power to have it run at ChatGPT speeds.

        That said, black-box self-modifying AIs will be the models that win the usefulness wars, and if one country outlaws them, the only result is that it will have no defense against countries that don’t feel the need to comply.

        • xor@infosec.pub · 9 months ago

          so, your first paragraph isn’t true. but i’ll point out that bitcoin is mined entirely with ASIC chips now, which only hash bitcoin transactions… they can’t compute anything else, so it’s not really comparable…
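          For context on why the comparison breaks down: Bitcoin ASICs are hard-wired to do one thing, a double round of SHA-256 over a block header, which is nothing like the matrix arithmetic an LLM needs. A minimal Python sketch of that hash (the 80-byte header here is a toy stand-in, not a real block):

          ```python
          import hashlib

          def btc_pow_hash(header: bytes) -> bytes:
              """Bitcoin's proof-of-work hash: SHA-256 applied twice.
              ASIC miners implement exactly this in silicon and nothing else."""
              return hashlib.sha256(hashlib.sha256(header).digest()).digest()

          # Toy 80-byte header stand-in (a real header packs version, previous
          # block hash, merkle root, timestamp, difficulty bits, and nonce).
          digest = btc_pow_hash(b"\x00" * 80)
          print(digest.hex())  # a fixed 32-byte (256-bit) digest
          ```

          A chip that can only evaluate this one fixed function can’t be repurposed for general computation, which is why “a quarter of Bitcoin’s hash power” doesn’t translate into LLM compute.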

          second part i do agree with except for self-modifying… although that doesn’t seem too far away…

          • mods_are_assholes@lemmy.world · 9 months ago

            You really don’t understand how LLM data blobs are created, do you? Or how ridiculously compressed they are?