petsoi@discuss.tchncs.de to Linux@lemmy.ml · 2 days ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org)
theshatterstone54 · 2 days ago
Personally, I'd just recommend either Alpaca or GPT4All, both of which are on Flathub and much easier to set up (or at least GPT4All is; I haven't tested Alpaca yet).
Ashley@lemmy.ca · 19 hours ago
Alpaca is great. I can even run it on my OnePlus 6T, albeit slowly, and the largest model I got running was Llama 7B.
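For anyone wanting to try the Flatpak route mentioned above, a minimal sketch of the install commands. The application IDs com.jeffser.Alpaca and io.gpt4all.gpt4all are assumptions here; confirm them with `flatpak search` before installing.

```sh
# Add the Flathub remote if it isn't configured already
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo

# Install Alpaca (app ID assumed; verify with: flatpak search alpaca)
flatpak install -y flathub com.jeffser.Alpaca

# Install GPT4All (app ID assumed; verify with: flatpak search gpt4all)
flatpak install -y flathub io.gpt4all.gpt4all

# Launch either app from the terminal, or from the desktop menu
flatpak run com.jeffser.Alpaca
```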