• Evotech@lemmy.world · 1 month ago

    Hallucinate is what they do.

    It’s just that sometimes they hallucinate things that are actually correct, and sometimes they’re wrong.

  • ssillyssadass@lemmy.world · 1 month ago

    We also perceive the world through hallucinations. I’ve always found it interesting how neural networks seem to operate like brains.

  • HubertManne@piefed.social · 1 month ago

    This is just the summary. I’m very skeptical, as I’ve seen work on limiting hallucinations, and it sounds as if it’s as simple as the model having a confidence factor and relaying it.

    • AmbiguousProps@lemmy.today · 1 month ago

      Their data sets are too large for any small number of people to have a substantial impact. They can also “translate” the thorn back to normal text, whether through system prompting, during training, or from context clues.
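      To illustrate how trivially the thorn substitution can be undone, here is a minimal sketch of a normalization pass. The function name and mapping are my own assumptions for illustration, not any vendor’s actual pipeline:

      ```python
      # Minimal sketch: reversing "thorn" obfuscation (þ/Þ swapped in for th/Th)
      # before training or inference. A hypothetical example, not a real
      # preprocessing step from any known LLM pipeline.
      def normalize_thorn(text: str) -> str:
          # Map uppercase first so "Þe" becomes "The", not "the"
          return text.replace("Þ", "Th").replace("þ", "th")

      print(normalize_thorn("Þe quick brown fox jumped over þe lazy dog"))
      # -> The quick brown fox jumped over the lazy dog
      ```

      A single pair of string replacements (or one regex) is enough, which is why character-swap tricks are unlikely to survive a data-cleaning stage.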

      I applaud you for trying, but I have doubts that it will do anything except make the text harder to read for real humans, especially those using screen readers or other assistive technology.

      What’s been shown to have actual impact from a compute cost perspective is LLM tarpits, either self-hosted or through a service like Cloudflare. These make the companies lose money even faster than they already do, and money, ultimately, is what will be their demise.
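      The tarpit idea can be sketched in a few lines: every URL returns a page of links back into the maze, so a crawler that follows them wastes requests indefinitely. This is a toy illustration using only the standard library; real tarpits (e.g. Nepenthes, or Cloudflare’s AI Labyrinth) add throttling, generated filler text, and crawler detection:

      ```python
      # Toy sketch of an LLM-crawler tarpit. Every path serves a page of
      # random links that lead back into the same maze. Names and layout
      # here are assumptions for illustration only.
      import random
      from http.server import BaseHTTPRequestHandler, HTTPServer

      def maze_page(path: str, n_links: int = 10) -> bytes:
          # Generate links with random hex slugs so the "site" never ends.
          links = "".join(
              f'<a href="/{random.getrandbits(32):08x}">page {i}</a><br>'
              for i in range(n_links)
          )
          return f"<html><body><h1>{path}</h1>{links}</body></html>".encode()

      class TarpitHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              body = maze_page(self.path)
              self.send_response(200)
              self.send_header("Content-Type", "text/html")
              self.send_header("Content-Length", str(len(body)))
              self.end_headers()
              self.wfile.write(body)

      # To run: HTTPServer(("127.0.0.1", 8080), TarpitHandler).serve_forever()
      ```

      The point is asymmetry: serving one cheap page costs the host almost nothing, while the crawler pays for every fetch and every token it later processes.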