LionyxML@alien.top (OP) · 1 year ago

    Nice!

    Ollama runs in CPU mode if you don’t have a GPU, and there are also models that can run in as little as 4 GB of RAM.

    Things won’t be fast, but for experimenting it is enough :)
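
    If you want to poke at it from code once it’s running, here is a minimal sketch of calling a local Ollama server over its HTTP API. It assumes Ollama is listening on its default port (11434) and that you have already pulled some small model; the model name (`tinyllama`) and prompt here are just placeholders.

    ```python
    # Minimal sketch: query a locally running Ollama instance over its HTTP API.
    # Assumes `ollama serve` is running on the default port and that a small
    # model has been pulled beforehand, e.g. `ollama pull tinyllama`.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "tinyllama",  # example of a small model that fits in little RAM
            "prompt": "Explain what CPU-only inference means in one sentence.",
            "stream": False,       # return a single JSON object instead of a stream
        },
        timeout=300,               # CPU-only generation can be slow
    )
    resp.raise_for_status()
    print(resp.json()["response"])
    ```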