Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • bionicjoey@lemmy.ca · 9 months ago

    No, because anyone who knows what a Nazi is and trusts that the person giving them instructions is not insane can assume that the first directive is meant to be a general note for their future work and not to be applied to the second directive. If one wanted pictures of racially diverse Nazis, they would need to be more explicit.

    • EatATaco@lemm.ee · 9 months ago

      “the first directive is meant to be a general note for their future work and not to be applied to the second directive”

      This is the root question, which you just gloss over. Why? It’s a general note, so why should one assume it doesn’t apply? You seem to be saying “it applies except when it doesn’t.” The rational thing would be to assume the general note applies unless you’re explicitly told otherwise, or there is some good reason to believe that wasn’t the intent.

      Also, FYI, the request was for German soldiers, not Nazis.

      And don’t get me wrong, I agree with you that it should not generate black German soldiers from 1939 without being explicitly told to do so. But I think this is a problem with its directives rather than evidence that it’s not thinking.
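
      To make the distinction concrete, here is a minimal sketch of the two behaviors being argued about: a general directive applied unconditionally to every request versus one that yields to explicit context. This is illustrative Python with entirely hypothetical names; nothing here reflects Gemini’s actual pipeline.

      ```python
      # Hypothetical prompt-rewriting step. DIVERSITY_DIRECTIVE and both
      # functions are invented for illustration, not Gemini's real code.
      DIVERSITY_DIRECTIVE = "depict a racially diverse range of people"

      def rewrite_naive(user_prompt: str) -> str:
          """Apply the general note to every request, even historically
          specific ones -- this is how you get diverse 1939 soldiers."""
          return f"{user_prompt}, {DIVERSITY_DIRECTIVE}"

      def rewrite_contextual(user_prompt: str, historically_specific: bool) -> str:
          """Apply the general note only when the request is not pinned
          to a specific historical context."""
          if historically_specific:
              return user_prompt
          return f"{user_prompt}, {DIVERSITY_DIRECTIVE}"

      print(rewrite_naive("a German soldier in 1939"))
      # a German soldier in 1939, depict a racially diverse range of people
      print(rewrite_contextual("a German soldier in 1939", historically_specific=True))
      # a German soldier in 1939
      ```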