• @mwguy
    2 points · 3 months ago

    This AI isn’t an LLM.

    • @intrepid@lemmy.ca
      4 points · 3 months ago

      Doesn’t mean that it won’t hallucinate. Or whatever you call it when an AI makes up crap.

      • @mwguy
        0 points · 3 months ago

        LLMs hallucinate all the time. The hallucination is the feature. Depending on how you design the neural network, you can get an AI that doesn’t hallucinate. LLMs have to, because they’re mimicking human speech patterns and predicting one of many possible responses.

        A model that tries to predict people’s locations likely wouldn’t work like that (see the sketch below).
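
        To make the difference concrete, here’s a minimal sketch (toy numbers I made up, not from any real model) of why sampled text generation can confidently produce nonsense while a point-estimate predictor just emits its single best guess:

        ```python
        import random

        # LLM-style decoding: the next token is *sampled* from a probability
        # distribution, so a fluent-but-wrong continuation can get picked.
        # (Hypothetical toy probabilities, purely for illustration.)
        next_token_probs = {"Paris": 0.6, "Lyon": 0.3, "Atlantis": 0.1}
        tokens = list(next_token_probs)
        weights = list(next_token_probs.values())
        print("LLM-style sample:", random.choices(tokens, weights=weights)[0])

        # Location-predictor style: a regression model outputs one best
        # estimate (e.g. lat/lon coordinates); there is no sampling step
        # that can "choose" an invented alternative.
        predicted_location = (48.8566, 2.3522)  # single deterministic output
        print("Location estimate:", predicted_location)
        ```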

      • @mwguy
        1 point · 3 months ago

        I mean, it probably has a neural network component.

      • @mwguy
        1 point · 3 months ago

        The primary feature of LLMs is the hallucination.