diff --git a/wiki/docker/ollama_-_ollama.md b/wiki/docker/ollama_-_ollama.md
new file mode 100644
index 0000000..477b110
--- /dev/null
+++ b/wiki/docker/ollama_-_ollama.md
@@ -0,0 +1,40 @@
+# ollama - ollama
+
+This is a [Docker](/wiki/docker.md) container for an Ollama server.
+The official container and documentation were made by
+[ollama](https://hub.docker.com/r/ollama/ollama).
+
+## Set-up
+
+Create the file `rebuild.sh`.
+Change the settings according to your needs and run `./rebuild.sh` afterwards.
+
+## Ports
+
+Set the following ports with the `-p` tag.
+
+| Container Port | Recommended outside port | Protocol | Description     |
+| -------------- | ------------------------ | -------- | --------------- |
+| `11434`        | `11434`                  | TCP      | Ollama API port |
+
+## Volumes
+
+Set the following volumes with the `-v` tag.
+
+| Outside mount/volume name | Container mount | Description |
+| ------------------------- | --------------- | ----------- |
+| `ollama`                  | `/root/.ollama` | Ollama data |
+
+## rebuild.sh
+
+```sh
+#!/bin/sh
+docker stop ollama
+docker rm ollama
+docker pull ollama/ollama
+docker run --name ollama \
+           --restart unless-stopped \
+           -p 11434:11434 \
+           -v ollama:/root/.ollama \
+           -d ollama/ollama
+```
diff --git a/wiki/docker/open-webui_-_open-webui.md b/wiki/docker/open-webui_-_open-webui.md
new file mode 100644
index 0000000..8995c93
--- /dev/null
+++ b/wiki/docker/open-webui_-_open-webui.md
@@ -0,0 +1,40 @@
+# open-webui - open-webui
+
+This is a [Docker](/wiki/docker.md) container for an Open WebUI server.
+The official container and documentation were made by
+[open-webui](https://github.com/open-webui/open-webui).
+
+## Set-up
+
+Create the file `rebuild.sh`.
+Change the settings according to your needs and run `./rebuild.sh` afterwards.
+
+## Ports
+
+Set the following ports with the `-p` tag.
+
+| Container Port | Recommended outside port | Protocol | Description |
+| -------------- | ------------------------ | -------- | ----------- |
+| `8080`         | `8080`                   | TCP      | WebUI       |
+
+## Volumes
+
+Set the following volumes with the `-v` tag.
+
+| Outside mount/volume name | Container mount     | Description     |
+| ------------------------- | ------------------- | --------------- |
+| `open-webui`              | `/app/backend/data` | Open WebUI data |
+
+## rebuild.sh
+
+```sh
+#!/bin/sh
+docker stop openwebui
+docker rm openwebui
+docker pull ghcr.io/open-webui/open-webui:main
+docker run --name openwebui \
+           --restart unless-stopped \
+           -p 8080:8080 \
+           -v open-webui:/app/backend/data \
+           -d ghcr.io/open-webui/open-webui:main
+```
diff --git a/wiki/open_webui.md b/wiki/open_webui.md
new file mode 100644
index 0000000..2eda20a
--- /dev/null
+++ b/wiki/open_webui.md
@@ -0,0 +1,64 @@
+# Open WebUI
+
+[Open WebUI](https://openwebui.com/) is a self-hostable artificial intelligence interface that
+supports different workflows and can even operate entirely offline.
+
+## Setup
+
+The software can be set up via [Docker](/wiki/docker.md) with the
+[open-webui image](/wiki/docker/open-webui_-_open-webui.md).
+
+Additionally a provider for the artificial intelligence models is needed.
+This can be done by using Ollama, which can be set up via Docker with the
+[ollama image](/wiki/docker/ollama_-_ollama.md).
+When using this option the address and port of Ollama have to be set in the admin settings of
+Open WebUI.
+
+As an alternative to a self-hosted Ollama, the official ChatGPT service can also be linked to
+Open WebUI.
+This also has to be set up in the admin settings.
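+
+To check that Ollama is reachable before entering its address, its HTTP API can be queried.
+The following is a minimal sketch that assumes Ollama listens on `localhost:11434` as set up in
+the [ollama image](/wiki/docker/ollama_-_ollama.md).
+
+```sh
+# Returns a JSON list of the locally available models if Ollama is reachable.
+curl http://localhost:11434/api/tags
+```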
+
+## Usage
+
+### Downloading and Selecting New Models
+
+Models are not downloaded via Open WebUI directly.
+Instead they are managed by the provider of the AI (in this case Ollama).
+When using Ollama as described in [the setup section](#setup) the following command lists the
+locally available models (see the end of this section for running these commands inside the
+Docker container).
+
+```sh
+ollama list
+```
+
+Afterwards a new model can be pulled by using the following command. `<model>` is the name of
+the desired model (for example `deepseek-r1`).
+
+```sh
+ollama pull <model>
+```
+
+After the model has been downloaded it has to be selected in the admin settings of Open WebUI.
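+
+Since Ollama runs inside a [Docker](/wiki/docker.md) container in this setup, the commands above
+are executed inside that container.
+A minimal sketch, assuming the container is named `ollama` as in the
+[ollama image](/wiki/docker/ollama_-_ollama.md):
+
+```sh
+# List the models that are already available inside the ollama container.
+docker exec -it ollama ollama list
+
+# Pull a new model, for example deepseek-r1.
+docker exec -it ollama ollama pull deepseek-r1
+```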