• ContrarianTrail@lemm.ee · 17 hours ago

    That’s my issue with people saying stuff like “I can immediately tell when a picture is made with AI and I hate how they look”

    Your assessment doesn’t take the false negatives into account. You have no idea how many pictures have already fooled you. By definition, if you can immediately tell a picture is AI, it’s a badly made one. It’s a bit like seeing the most flamboyantly gay person on the street and concluding that all gay people look like that and that you can always spot them, while the closeted friend you’re with flies perfectly under the radar.

    • zarkanian@sh.itjust.works · 1 hour ago

      I recently saw a photo on some website. It was from a Trump rally, and people had these freaky, ecstatic looks on their faces. Somebody commented that it looked like AI. Other people soon agreed; one of them remarked on the bizarre, “alien” hand on one of the babies in the crowd. That hand did look weird. There were too few fingers. It looked like a Teenage Mutant Ninja Turtle hand.

      The problem was that the image came from a news story published years before ChatGPT and the current AI boom. For it to be AI, the photographer would have needed access to experimental software that was years away from public release.

      Sometimes people just look weird and, sometimes, they have weird hands, too.

    • RQG@lemmy.world · 7 hours ago

      Reminds me of all the people who believe commercials and advertising don’t work on them. Sure, that’s why billions are spent on it: because it doesn’t do anything. Or does it only work on all the other people?

      That’s why it is so hard to get that stuff regulated. People believe it doesn’t work on them.

      • Donkter@lemmy.world · 1 hour ago

        That’s the real fear of AI. Not that it’s stealing art jobs or whatever, but that all it takes is a politician or businessman claiming something is AI, no matter how well corroborated it is, to throw a whole investigation for a loop. It’s not a big problem yet, because outside a few internet bubbles hardly anyone realizes how advanced AI has become, and people still think you can easily tell generated images apart. But imagine 5 years from now, or 10.

    • Johanno@feddit.org · 12 hours ago

      Many unedited images, or ones made with older AI models, I can spot at a glance. A few more I can find by looking for inconsistencies like mangled hands or illogical items.

      However, I’m sure there will be more and more AI-generated images, even ones lightly edited afterwards, that I can’t detect.

      You will need an AI to detect them, since, at least for images, AI output leaves detectable traces in the way the files are created.

      • OneMeaningManyNames@lemmy.ml · 7 hours ago

        In AI-generated audio you can see it in the waveform: there’s less random noise overall, and it looks like one big, well, wave. I wonder if something similar is true for images.
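        The "less random noise" idea can be probed with spectral flatness, a standard signal-processing measure: the ratio of the geometric to the arithmetic mean of the power spectrum. Broadband noise scores high; a clean, tone-like signal scores near zero. This is a generic sketch on synthetic signals, not a test of any real AI model:

```python
import cmath
import math
import random

def power_spectrum(signal):
    """Naive DFT power spectrum (fine for short, illustrative signals)."""
    n = len(signal)
    spectrum = []
    for k in range(1, n // 2):  # skip the DC bin
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        spectrum.append(abs(s) ** 2)
    return spectrum

def spectral_flatness(signal):
    """Geometric mean / arithmetic mean of the power spectrum.
    High for broadband noise, near zero for a clean tone."""
    power = [p + 1e-12 for p in power_spectrum(signal)]  # floor avoids log(0)
    geometric = math.exp(sum(math.log(p) for p in power) / len(power))
    arithmetic = sum(power) / len(power)
    return geometric / arithmetic

random.seed(0)
n = 256
noise = [random.gauss(0, 1) for _ in range(n)]                # broadband noise
tone = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]  # one clean "wave"

print("noise flatness:", round(spectral_flatness(noise), 3))
print("tone flatness:", round(spectral_flatness(tone), 6))
```

        Real detectors use far more robust features than this, but the principle is the same: measure how the signal’s energy is spread across frequencies.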

        • hex@programming.dev · 6 hours ago

          Basically yes: lack of fine detail, especially in small things like hair or fingers. The texture and definition in AI images is usually lower. Though, once again, it depends on the technique being used.
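          The lower texture/definition can be made concrete with a toy metric, the mean squared Laplacian, which measures high-frequency detail. Blurring an image (a crude stand-in for the overly smooth AI look; the data here is synthetic noise, not a real photo) sharply reduces it:

```python
import random

def laplacian_energy(img):
    """Mean squared 4-neighbour Laplacian over the interior:
    a crude measure of fine detail (high-frequency texture)."""
    h, w = len(img), len(img[0])
    total = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])
            total += lap * lap
    return total / ((h - 2) * (w - 2))

def box_blur(img):
    """3x3 box blur: a stand-in for the smoothed-out AI look."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9
    return out

random.seed(1)
detailed = [[random.random() for _ in range(64)] for _ in range(64)]
smooth = box_blur(detailed)

print("detailed:", round(laplacian_energy(detailed), 3))
print("smoothed:", round(laplacian_energy(smooth), 3))
```

          The blurred image keeps the same overall content but loses most of its high-frequency energy, which matches the "less texture" impression described above.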