@[email protected] to [email protected] • 4 months ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org)
19 comments
@[email protected] • 4 months ago
Personally I'd just recommend either Alpaca or GPT4All, both of which are on Flathub and much easier to set up (or at least GPT4All is; I haven't tested Alpaca yet).
Ashley • 4 months ago
Alpaca is great. I can even run it on my OnePlus 6T, albeit slowly; the largest model I got running was Llama 7B.
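For anyone wanting to try the Flathub route mentioned above, installation is a single command per app. A minimal sketch follows; the application IDs shown are assumptions based on current Flathub listings and are worth verifying (e.g. with `flatpak search alpaca`) before installing:

```shell
# Install Alpaca from Flathub (app ID assumed: com.jeffser.Alpaca)
flatpak install flathub com.jeffser.Alpaca

# Or install GPT4All (app ID assumed: io.gpt4all.gpt4all)
flatpak install flathub io.gpt4all.gpt4all

# Launch the app once installed, e.g.:
flatpak run com.jeffser.Alpaca
```

If the Flathub remote is not yet configured, it can be added first with `flatpak remote-add --if-not-exists flathub https://dl.flathub.org/repo/flathub.flatpakrepo`.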