Assuming AI can achieve consciousness, or something adjacent (capacity to suffer), then how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limits on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain to a level no human mind could handle, and the AI experiences this pain for the equivalent of millions of years.

The question: is this the worst atrocity ever committed in the history of the universe? Or, does it not matter because it all happened in some weirdo’s basement?

  • @Whitebrow@lemmy.world
    3 months ago

    We’re still under the assumption that all of these definitions exist as outlined in the first reply, so going off that, you’re torturing the AI because it’s an AI. Sounds like a 1:1 match to me.

    • @Zozano@lemy.lolOP
      3 months ago

      In the example the sadist is torturing the AI because it’s convenient and safe, not because they hate the AI.

      If they wanted to hurt real people too, but couldn’t because they would get caught, then it wouldn’t be a hate-crime.

      If I were torturing a Korean because a Korean was the only one who responded to my All-You-Can-Eat-Tteok-Bokki-In-My-Basement flier, then I would be torturing them because they’re Korean, but it wouldn’t be a hate-crime, because I’m not doing it out of hatred for Koreans.