• Turun@feddit.de
    5 months ago

    Yes it is intentional.

    Some inference APIs even expose a way to set the “temperature” — higher values mean more randomized (feels creative) output, lower values mean less randomness. A temperature of 0 makes the model deterministic.
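    A rough sketch of what that temperature knob does at sampling time (the function name and logit values here are made up for illustration, not any particular API):

    ```python
    import math
    import random

    def sample_token(logits, temperature, rng=random):
        # Sketch of temperature sampling over a model's output logits.
        if temperature == 0:
            # Greedy decoding: always pick the highest-logit token,
            # so the output is deterministic.
            return max(range(len(logits)), key=logits.__getitem__)
        # Scale logits by 1/temperature, then sample from the softmax.
        # Higher temperature flattens the distribution (more random);
        # lower temperature sharpens it (closer to greedy).
        scaled = [x / temperature for x in logits]
        m = max(scaled)  # subtract the max for numerical stability
        weights = [math.exp(x - m) for x in scaled]
        return rng.choices(range(len(logits)), weights=weights, k=1)[0]
    ```

    (`random.choices` normalizes the weights internally, so there is no need to divide by their sum.)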

    • istanbullu@lemmy.ml
      5 months ago

      Even at temperature 0 the model will not be deterministic, because the output also depends on the seed used, as well as things like numerical noise.

      • Turun@feddit.de
        5 months ago

        Yeah no, that’s not how this works.

        Where in the process does that seed play a role, and what do you even mean by “numerical noise”?

        Edit: I feel like I should add that I am very interested in learning more. If you can provide me with any sources to show that GPTs are inherently random I am happy to eat my own hat.