Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near “human level robustness and accuracy.”

But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text — known in the industry as hallucinations — can include racial commentary, violent rhetoric and even imagined medical treatments.

Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.
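For context, here is a minimal sketch of how the open-source `whisper` Python package is typically invoked for the transcription and translation uses the article describes; the model size and file name are placeholders. Note that nothing in the returned result flags fabricated text, which is part of why hallucinations slip through downstream.

```python
# Minimal sketch using the open-source whisper package;
# "audio.mp3" and the "base" checkpoint are placeholder choices.
import whisper

model = whisper.load_model("base")       # smaller checkpoints trade accuracy for speed
result = model.transcribe("audio.mp3")   # transcribe in the source language
print(result["text"])                    # plain text only; no confidence flags for invented words

# The same call can translate non-English speech into English:
translated = model.transcribe("audio.mp3", task="translate")
print(translated["text"])
```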

    • catloaf@lemm.ee · 8 hours ago
      Bold of you to assume there was any testing process involved beyond “does it run? ship it.”

    • Cenotaph@mander.xyz · 8 hours ago

      You’d think. If I were the one paying for it, I would be changing providers, but you know how that goes. I just work here, man.