Hi guys, I was installing and testing some LLMs on a bunch of different systems the other day and I came up with this install script to get Open WebUI installed on a Pop!_OS (Ubuntu-based) system really quickly and pain-free. So I thought I would share it below for anyone else looking to get set up. Once this is installed you can just download the models you want to try and away you go.
For anyone wanting to take their own personal AI with them, I have also had a little success installing this on a virtual machine which also has a cloudflared tunnel set up, meaning I can host the server at home on my own hardware and reach it via a top-level domain quickly and easily. I will post my cloudflared config file below as well for anyone looking at doing that. Be warned, I am not 100% sure how secure pointing a public web address at a server behind your own firewall is, so be careful. Also set a strong password on your Open WebUI server!
Install script -
#!/bin/bash
# Update and upgrade the system packages
sudo apt update && sudo apt upgrade -y
# Remove existing Docker and container-related packages
for pkg in docker.io docker-doc docker-compose docker-compose-v2 podman-docker containerd runc; do
    sudo apt-get remove "$pkg" -y
done
# Install necessary packages for fetching files over HTTPS
sudo apt install apt-transport-https ca-certificates curl -y
# Add Docker's official GPG key (create the keyrings directory first and make the key readable)
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
# Add Docker repository
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# Update the apt package index
sudo apt update
# Install Docker Engine, CLI, and containerd
sudo apt install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin -y
# Install Ollama via its official install script
curl -fsSL https://ollama.com/install.sh | sh
# Run Open WebUI in Docker, pointing it at the local Ollama instance
sudo docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
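Once the script finishes, the Open WebUI container uses host networking and listens on port 8080 by default. You can grab models through the Open WebUI settings page, or pull one from the command line with Ollama. A quick sketch of what I do to check everything is up (the model name here is just an example, swap in whatever you want to try):

# Check the Open WebUI container came up OK
sudo docker ps --filter name=open-webui
# Pull a model for Ollama to serve (llama3 is just an example)
ollama pull llama3
# Then browse to http://localhost:8080 (or http://<server IP>:8080 from another machine)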
cloudflared tunnel config.yml file -
tunnel: #paste your tunnel ID here
credentials-file: #path to your credentials file, default is /home/user/.cloudflared/<tunnel ID>.json
ingress:
  - hostname: #the URL you own
    service: http://<your server's IP>:8080 #Open WebUI listens on port 8080 by default with host networking
    originRequest:
      noTLSVerify: true
  - service: http_status:404
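For completeness, this is roughly how I set the tunnel up before dropping that config in. The tunnel name and hostname below are just example placeholders, use your own:

# Authenticate cloudflared against your Cloudflare account
cloudflared tunnel login
# Create the tunnel - this generates the tunnel ID and credentials file referenced in the config above
cloudflared tunnel create my-openwebui-tunnel
# Point your hostname at the tunnel (the domain must be managed in your Cloudflare account)
cloudflared tunnel route dns my-openwebui-tunnel ai.example.com
# Run the tunnel with the config, or install it as a service so it survives reboots
cloudflared tunnel run my-openwebui-tunnel
sudo cloudflared service install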