• wonko7@alien.top · 11 months ago

    that’s so cool, thanks!

    I think I’m gonna need new hardware if I want to play with this…

    • LionyxML@alien.top (OP) · 11 months ago

      Nice!

      Ollama runs in CPU mode if you don’t have a GPU, and there are models that run with as little as 4 GB of RAM.

      Things won’t be fast, but for experimenting it’s enough :)
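
      For example, here’s a minimal sketch of talking to a locally running Ollama server from Python. It assumes Ollama is already serving on its default port and that a small model (I’m using "tinyllama" purely as an example of something that fits in little RAM) has been pulled beforehand:

      ```python
      # Rough sketch: query a local Ollama instance over its REST API.
      # Assumes `ollama serve` is running and a small model such as
      # "tinyllama" was pulled first (e.g. `ollama pull tinyllama`).
      import requests

      response = requests.post(
          "http://localhost:11434/api/generate",
          json={
              "model": "tinyllama",   # example small model; pick one that fits your RAM
              "prompt": "Explain CPU-only inference in one sentence.",
              "stream": False,        # return the whole answer at once
          },
          timeout=300,                # CPU-only generation can take a while
      )
      print(response.json()["response"])
      ```

      On CPU the reply will take a while, but it’s plenty for getting a feel for the tool before buying new hardware.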