I’ve only been using it for a couple of days, but at times it gets unresponsive, and it happens when I switch to a network with Internet access. Then I turn the WiFi off and it gives me an error about trying to access a “resource.”
I have a feeling this thing is logging all activity and reporting it somewhere. Which just makes sense in our world. Free software just isn’t free.
Does anybody know of a good way to keep this thing actually local? Is some kind of router firewall rule required?
OFFLINE_MODE=True
ENABLE_OPENAI_API=False
That will get rid of most online activity, I’m pretty sure.
Are those Docker install parameter options? Where do I even find these? Also, hi @felixthecat
I installed mine in a Docker container. My first time ever “using” Docker although I have no idea what I’m doing. I can cut and paste from the Internet like a champ though.
It’s also really weird: I just connected to Open WebUI over my LAN from a different computer, and while it sometimes gives me the same speed as if I were sitting at the host computer, most of the time it’s super slow, yet I can hear the GPU cranking on the host machine. Surely just passing the prompt over and getting the response back isn’t causing that, so I’m not sure what’s going on.
@felixthecat, thanks for the response, I will try that.
It’s probably just checking if you’re on the Internet, because there’s a toggle to allow models to look for stuff online.
Yes, they are environment variables for Docker.
I found them in a GitHub issue a few weeks ago; not sure where they are documented.
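If it helps, they get passed as -e flags when you create the container. Roughly something like this, assuming the standard ghcr.io image and the usual port mapping from the docs (adjust the ports, volume, and tag to whatever you’re actually running; this is just a sketch of how the env vars get set, not official documentation):

docker run -d \
  -p 3000:8080 \
  -e OFFLINE_MODE=True \
  -e ENABLE_OPENAI_API=False \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

You can’t change environment variables on an existing container, so you’d stop and remove the old one and recreate it with these flags added.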
And hi to you too!
On the host machine I’m getting 48.5 tokens/sec for the response and 5,000 tokens/sec for the prompt.
On the client machine on the same network, I’m getting 1 token/sec for the response and 92 tokens/sec for the prompt.
But I hear the host machine GPU cranking.
No idea how to fix that. If I have a chat going on the client computer, a different user on the host computer has to wait, so I would imagine both browser sessions are using the host computer’s resources, which makes it even harder to understand why the client is so slow.
Edit: just did an apt update and now both are fast. I dunno.
Edit Edit: It’s happening again, and I confirmed that the client machine’s requests are not triggering GPU usage on the host machine, hence the slowness.
And I still get the occasional “TypeError: NetworkError when attempting to fetch resource” in my LLM chat despite having Web Search and Web Loader Settings disabled. So it’s using the Internet for something.