• Dirt_Owl [comrade/them, they/them]@hexbear.net
    1 day ago

    Well at least they aren’t strapping guns to them like the US is.

    Still, is it strange that I don’t like the idea of making a whole class of robots to do our dirty work? I know I’m probably just anthropomorphising, but it feels wrong.

      • Nacarbac [any]@hexbear.net
        22 hours ago

        I don’t think that actually follows. We’d certainly be in a position to practice and refine the process, but we couldn’t necessarily guarantee that it’s working until we give the (apologies for the Harry Potter reference, but I think it apt) Robot House Elf a pistol and turn around. Also, ethics.

        Luckily the simple solution is to just not make a sapient slave race, robotic or otherwise. Sapience isn’t necessary for an autonomous tool.

        • Saeculum [he/him, comrade/them]@hexbear.net
          21 hours ago

          My point of view is that in humans, and in animals generally, emotions are largely a chemical response in the brain. We might not fully understand how those processes interact, but we do know that certain chemicals cause certain feelings, and that there is a mechanism in the brain governing emotion that is notionally separate from our capacity for rational thought.

          I am willing to concede that it might be possible for a sufficiently complex computer to develop, accidentally or in a way not entirely within our understanding, a capacity for rational thought that we would recognise as sapience, or at least animal-level intelligence.

          I am not willing to concede that such a computer could develop a capacity for what we recognise as emotion without it being intentionally designed in, and if it’s designed in, we necessarily understand it. This happens in fiction a lot because it’s more compelling to anthropomorphise AI characters, not because it’s particularly plausible.

    • m532@lemmygrad.ml
      1 day ago

      Now that I think about it, robots shouldn’t resemble humans or animals, as they’d certainly be anthropomorphised otherwise.