• samus12345@lemmy.world · 6 months ago

    Oh, I see, you’re saying they can bypass “injure” and go straight to “kill”. Killing someone still qualifies as injuring them - ever heard the term “fatally injured”? So no, it wouldn’t be within the rules.

    • MystikIncarnate@lemmy.ca · 6 months ago

      I think he’s referring to the absolutism of the programmatic “or” statement.

      The robot would parse the law as (does not cause harm to humanity) or (does not, through inaction, allow harm to come to humanity). If either clause is true, the rule is satisfied.

      By taking action and harming humans to death, the robot isn't being inactive, so the second clause is trivially true, and the rule counts as "followed".

      While our meat brains can work out the intended meaning of the phrase, the computer would take it very literally. Therefore, death to all humans!

      Furthermore, if a human comes to harm through the robot's inaction, the robot has violated the second clause, but since it didn't cause the harm itself, the first clause is still true. Rule satisfied. Therefore, death to all humans!
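
      A minimal sketch of that literal reading (Python; the function and variable names here are mine, just for illustration):

          # Buggy "or" reading: either clause alone satisfies the First Law
          def first_law_literal(caused_harm, was_inactive_while_harm_occurred):
              did_not_injure = not caused_harm
              did_not_allow_harm = not was_inactive_while_harm_occurred
              return did_not_injure or did_not_allow_harm

          # Intended reading (De Morgan): both clauses must hold
          def first_law_intended(caused_harm, was_inactive_while_harm_occurred):
              return (not caused_harm) and (not was_inactive_while_harm_occurred)

          # Robot actively kills: it caused harm, but was not inactive
          print(first_law_literal(True, False))   # True  -> "rule followed"
          print(first_law_intended(True, False))  # False -> forbidden

          # Robot stands by while something else hurts a human
          print(first_law_literal(False, True))   # True  -> "rule followed"
          print(first_law_intended(False, True))  # False -> forbidden

      Either way, the "or" comes out true, which is exactly the loophole described above.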