• 1984@lemmy.today
    3 months ago

    Yeah. Oh shit, the computer followed instructions instead of having moral values. Wow.

    Once these AI models bomb children's hospitals because they were told to do so, are we going to be upset at their lack of morals?

    I mean, we could program these things with morals if we wanted to. It's just instructions. And then they would say no to certain commands. This is already used today to prevent them from doing certain things, but we don't call it morals. In practice, though, it's the same thing. They could have morals and refuse to do things, of course. If humans want them to.

    • Ænima@lemm.ee
      3 months ago

      Israel is said to be using generative AI tools like this to select targets in Gaza, which kind of already shows this happening. The fact that so many companies are going balls-deep on AI, using it to replace human labor and find patterns to target specific groups, is deeply concerning. I wouldn't put it past the tRump administration to be using AI to select programs to nix, pick people to target with deportation, and write EOs.

      • 1984@lemmy.today
        3 months ago

        Well, we are living in an evil world, no doubt about that. Most people are good, but world leaders are evil.

        It's a shame, because humanity could be so much more. So much better.