ChatGPT generates cancer treatment plans that are full of errors — Study finds that ChatGPT provided false information when asked to design cancer treatment plans

Researchers at Brigham and Women’s Hospital found that cancer treatment plans generated by OpenAI’s revolutionary chatbot were full of errors.

  • Uncaged_Jay@lemmy.world · 48 up / 1 down · 1 year ago

    “Hey, program that is basically just regurgitating information, how do we do this incredibly complex thing that even we don’t understand yet?”

    “Here ya go.”

    “Wow, this is wrong.”

    “No shit.”

    • JackbyDev@programming.dev · 20 up / 1 down · edited · 1 year ago

      “Be aware that ChatGPT may produce wrong or inaccurate results. What is your question?”

      How beat cancer

      wrong, inaccurate information

      😱

    • lolcatnip@reddthat.com · 1 up / 1 down · 1 year ago

      It reminds me of some particularly badly written episodes of Star Trek, where they use the holodeck to simulate some weird scenario involving exotic physics nobody understands, and it works perfectly.