
docker: added ollama and webui setup and shared page

This commit is contained in:
tiyn 2025-09-22 01:05:47 +02:00
parent c0e3170619
commit 7476364f83
3 changed files with 121 additions and 0 deletions

40
wiki/docker/ollama_-_ollama.md Normal file
View File

@@ -0,0 +1,40 @@
# ollama - ollama
This is a [Docker](/wiki/docker.md) container for an ollama server.
The official container and documentation were made by
[ollama](https://hub.docker.com/r/ollama/ollama).
## Set-up
Create the file `rebuild.sh`.
Change the settings according to your needs and run `./rebuild.sh` afterwards.
## Ports
Set the following ports with the `-p` tag.
| Container Port | Recommended outside port | Protocol | Description |
| -------------- | ------------------------ | --------- | ------------------- |
| `11434`        | `11434`                  | TCP       | Ollama API port     |
## Volumes
Set the following volumes with the `-v` tag.
| Outside mount/volume name | Container mount | Description |
| ------------------------- | --------------- | ------------- |
| `ollama` | `/root/.ollama` | Ollama data |
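The named volume does not need to be created beforehand; Docker creates it on the first run of the
script below. To check where the data actually ends up on the host, the volume can be inspected,
for example:
```sh
docker volume inspect ollama
```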
## rebuild.sh
```sh
#!/bin/sh
docker stop ollama
docker rm ollama
docker pull ollama/ollama
docker run --name ollama \
    --restart unless-stopped \
    -p 11434:11434 \
    -v ollama:/root/.ollama \
    -d ollama/ollama
```
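Once the container runs, a quick check from the host shows whether the server is reachable,
assuming the port mapping above:
```sh
# the root endpoint answers with "Ollama is running"
curl http://localhost:11434
# list the models currently stored on the server
curl http://localhost:11434/api/tags
```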

41
wiki/docker/open-webui_-_open-webui.md Normal file
View File

@@ -0,0 +1,41 @@
# open-webui - open-webui
This is a [Docker](/wiki/docker.md) container for an Open WebUI server.
The official container and documentation were made by
[open-webui](https://github.com/open-webui/open-webui).
## Set-up
Create the file `rebuild.sh`.
Change the settings according to your needs and run `./rebuild.sh` afterwards.
## Ports
Set the following ports with the `-p` tag.
| Container Port | Recommended outside port | Protocol | Description |
| -------------- | ------------------------ | --------- | ------------- |
| `8080`         | `8080`                   | TCP       | WebUI         |
## Volumes
Set the following volumes with the `-v` tag.
| Outside mount/volume name | Container mount | Description |
| ------------------------- | ------------------- | --------------- |
| `open-webui` | `/app/backend/data` | Open WebUI data |
## rebuild.sh
```sh
#!/bin/sh
docker stop openwebui
docker rm openwebui
docker pull ghcr.io/open-webui/open-webui:main
docker run --name openwebui \
    --restart unless-stopped \
    -p 8080:8080 \
    -v open-webui:/app/backend/data \
    -d ghcr.io/open-webui/open-webui:main
```
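After the start, the web interface should be reachable in the browser, for example under
`http://localhost:8080` when using the port mapping above. If the first start takes a while, the
logs can be followed:
```sh
docker logs -f openwebui
```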

40
wiki/open_webui.md Normal file
View File

@@ -0,0 +1,40 @@
# Open WebUI
[Open WebUI](https://openwebui.com/) is a self-hostable artificial intelligence interface that
allows different workflows and even offline operation.
## Setup
The software can be set up via [Docker](/wiki/docker.md) with the
[open-webui image](/wiki/docker/open-webui_-_open-webui.md).
Additionally, a provider for the artificial intelligence is needed.
This can be done by using Ollama, which can be set up via Docker with the
[ollama image](/wiki/docker/ollama_-_ollama.md).
When using this option, the address and port of Ollama have to be set in the admin settings of
Open WebUI.
As an alternative to a self-hosted Ollama, the official ChatGPT service can also be linked to
Open WebUI.
This also has to be set up in the admin settings.
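As an alternative to the admin settings, the Ollama address can also be handed to the Open WebUI
container as an environment variable when it is started. The following is only a minimal sketch of
the run command from the [open-webui image](/wiki/docker/open-webui_-_open-webui.md) page; the
address `http://192.168.178.2:11434` is a placeholder for wherever Ollama is reachable from inside
the container.
```sh
# OLLAMA_BASE_URL points Open WebUI at the Ollama server; the address below is only a placeholder
docker run --name openwebui \
    --restart unless-stopped \
    -p 8080:8080 \
    -e OLLAMA_BASE_URL=http://192.168.178.2:11434 \
    -v open-webui:/app/backend/data \
    -d ghcr.io/open-webui/open-webui:main
```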
## Usage
### Downloading and Selecting New Models
Models are not downloaded via Open WebUI directly.
Instead they are managed by the provider of the AI.
When using Ollama as described in [the setup section](#setup), the following command can be used to
list the locally available models.
```sh
ollama list
```
Afterwards a new model can be pulled by using the following command. `<model>` is the name of the
desired model (for example `deepseek-r1`).
```sh
ollama pull <model>
```
After the model has been downloaded, it has to be set in the admin settings of Open WebUI.
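When Ollama itself runs inside the Docker container from the
[ollama image](/wiki/docker/ollama_-_ollama.md), the commands above can be executed inside the
container, for example:
```sh
docker exec -it ollama ollama list
docker exec -it ollama ollama pull deepseek-r1
```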