This is the app I use for running AI locally on my laptop. It has Ollama built in, although I found that running Ollama in Podman is slightly faster.
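For reference, here's a sketch of how one might run Ollama in a Podman container. The `docker.io/ollama/ollama` image and port 11434 are Ollama's published defaults; the exact flags (volume name, model tag) are my own assumptions, not necessarily what the poster used.

```shell
# Guarded so it no-ops on machines without podman.
if command -v podman >/dev/null 2>&1; then
  # Serve the Ollama API on its default port, persisting
  # downloaded models in a named volume.
  podman run -d --name ollama \
    -p 11434:11434 \
    -v ollama:/root/.ollama \
    docker.io/ollama/ollama
  # Pull and chat with a model inside the container
  # (model tag is an example, not from the thread).
  podman exec -it ollama ollama run deepseek-v2
else
  echo "podman not installed; skipping"
fi
```

For GPU acceleration you'd additionally pass the relevant device flags (e.g. `--device nvidia.com/gpu=all` with the NVIDIA container toolkit), which is where the speed difference the poster mentions would come from.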
Looks like a nice all-in-one package to run local AI.
What specs are you running models on? What models?
Just found Alpaca yesterday too. It works super well on Pop!_OS 22.04, and the AMD plugin works great: no need to install ROCm, it just uses your GPU out of the box.
Not sure what the speed is, but it seems fairly fast to my AI noob ass, running DeepSeek-V2 16B.
Interesting, the multimodal image and PDF support raised an eyebrow from me. I haven't had much time to play around with multimodal models yet.