forked from open-webui/open-webui
refac: OLLAMA_API_BASE_URL deprecated
This commit is contained in:
parent 5ae121b087
commit f741adc6c9
12 changed files with 32 additions and 33 deletions
@@ -1,6 +1,6 @@
 # Ollama URL for the backend to connect
-# The path '/ollama/api' will be redirected to the specified backend URL
-OLLAMA_API_BASE_URL='http://localhost:11434/api'
+# The path '/ollama' will be redirected to the specified backend URL
+OLLAMA_BASE_URL='http://localhost:11434'

 OPENAI_API_BASE_URL=''
 OPENAI_API_KEY=''
@@ -20,7 +20,7 @@ FROM python:3.11-slim-bookworm as base
 ENV ENV=prod
 ENV PORT ""

-ENV OLLAMA_API_BASE_URL "/ollama/api"
+ENV OLLAMA_BASE_URL "/ollama"

 ENV OPENAI_API_BASE_URL ""
 ENV OPENAI_API_KEY ""
@@ -95,10 +95,10 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open

 - **If Ollama is on a Different Server**, use this command:

-- To connect to Ollama on another server, change the `OLLAMA_API_BASE_URL` to the server's URL:
+- To connect to Ollama on another server, change the `OLLAMA_BASE_URL` to the server's URL:

 ```bash
-docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```

 - After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
@@ -110,7 +110,7 @@ If you're experiencing connection issues, it’s often due to the WebUI docker c
 **Example Docker Command**:

 ```bash
-docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```

 ### Other Installation Methods
@@ -4,7 +4,7 @@

 The Open WebUI system is designed to streamline interactions between the client (your browser) and the Ollama API. At the heart of this design is a backend reverse proxy, enhancing security and resolving CORS issues.

-- **How it Works**: The Open WebUI is designed to interact with the Ollama API through a specific route. When a request is made from the WebUI to Ollama, it is not directly sent to the Ollama API. Initially, the request is sent to the Open WebUI backend via `/ollama/api` route. From there, the backend is responsible for forwarding the request to the Ollama API. This forwarding is accomplished by using the route specified in the `OLLAMA_API_BASE_URL` environment variable. Therefore, a request made to `/ollama/api` in the WebUI is effectively the same as making a request to `OLLAMA_API_BASE_URL` in the backend. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_API_BASE_URL/tags` in the backend.
+- **How it Works**: The Open WebUI is designed to interact with the Ollama API through a specific route. When a request is made from the WebUI to Ollama, it is not directly sent to the Ollama API. Initially, the request is sent to the Open WebUI backend via the `/ollama` route. From there, the backend is responsible for forwarding the request to the Ollama API. This forwarding is accomplished by using the URL specified in the `OLLAMA_BASE_URL` environment variable. Therefore, a request made to `/ollama` in the WebUI is effectively the same as making a request to `OLLAMA_BASE_URL` in the backend. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_BASE_URL/api/tags` in the backend.

 - **Security Benefits**: This design prevents direct exposure of the Ollama API to the frontend, safeguarding against potential CORS (Cross-Origin Resource Sharing) issues and unauthorized access. Requiring authentication to access the Ollama API further enhances this security layer.

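The forwarding described in the hunk above can be pictured with a short sketch. This is not the Open WebUI implementation, only an illustration of the idea under the new variable name: it assumes FastAPI and requests are installed, and the route handler and its name below are hypothetical.

```python
# Illustrative sketch only: forward /ollama/<path> to OLLAMA_BASE_URL/<path>.
# Not the Open WebUI implementation; the handler below is hypothetical.
import os

import requests
from fastapi import FastAPI, Request, Response

OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")

app = FastAPI()


@app.api_route("/ollama/{path:path}", methods=["GET", "POST"])
async def proxy_to_ollama(path: str, request: Request) -> Response:
    # A WebUI request to /ollama/api/tags becomes OLLAMA_BASE_URL/api/tags here.
    url = f"{OLLAMA_BASE_URL}/{path}"
    body = await request.body()
    upstream = requests.request(request.method, url, data=body)
    return Response(content=upstream.content, status_code=upstream.status_code)
```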
@@ -15,7 +15,7 @@ If you're experiencing connection issues, it’s often due to the WebUI docker c
 **Example Docker Command**:

 ```bash
-docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```

 ### General Connection Errors
@@ -25,8 +25,8 @@ docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_
 **Troubleshooting Steps**:

 1. **Verify Ollama URL Format**:
-   - When running the Web UI container, ensure the `OLLAMA_API_BASE_URL` is correctly set, including the `/api` suffix (e.g., `http://192.168.1.1:11434/api` for different host setups).
+   - When running the Web UI container, ensure the `OLLAMA_BASE_URL` is correctly set (e.g., `http://192.168.1.1:11434` for different host setups).
    - In the Open WebUI, navigate to "Settings" > "General".
-   - Confirm that the Ollama Server URL is correctly set to `[OLLAMA URL]/api` (e.g., `http://localhost:11434/api`), including the `/api` suffix.
+   - Confirm that the Ollama Server URL is correctly set to `[OLLAMA URL]` (e.g., `http://localhost:11434`).

 By following these enhanced troubleshooting steps, connection issues should be effectively resolved. For further assistance or queries, feel free to reach out to us on our community Discord.
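The "Verify Ollama URL Format" step above can also be checked outside the container with a small probe (an illustrative sketch, not part of the project); `/api/tags` is the same Ollama endpoint referenced in the architecture notes.

```python
# Quick connectivity check for an Ollama server (illustrative, not part of Open WebUI).
import requests

OLLAMA_BASE_URL = "http://localhost:11434"  # adjust to your server; no /api suffix

try:
    # The Ollama API lives under /api on the base URL; /api/tags lists available models.
    response = requests.get(f"{OLLAMA_BASE_URL}/api/tags", timeout=5)
    response.raise_for_status()
    models = response.json().get("models", [])
    print(f"Reachable: {len(models)} model(s) available")
except requests.RequestException as exc:
    print(f"Cannot reach Ollama at {OLLAMA_BASE_URL}: {exc}")
```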
@@ -15,7 +15,7 @@ import asyncio
 from apps.web.models.users import Users
 from constants import ERROR_MESSAGES
 from utils.utils import decode_token, get_current_user, get_admin_user
-from config import OLLAMA_BASE_URL, WEBUI_AUTH
+from config import OLLAMA_BASE_URLS

 from typing import Optional, List, Union

@@ -29,8 +29,7 @@ app.add_middleware(
     allow_headers=["*"],
 )

-app.state.OLLAMA_BASE_URL = OLLAMA_BASE_URL
-app.state.OLLAMA_BASE_URLS = [OLLAMA_BASE_URL]
+app.state.OLLAMA_BASE_URLS = OLLAMA_BASE_URLS

 app.state.MODELS = {}

@@ -14,7 +14,7 @@ import json
 from utils.utils import get_admin_user
 from utils.misc import calculate_sha256, get_gravatar_url

-from config import OLLAMA_API_BASE_URL, DATA_DIR, UPLOAD_DIR
+from config import OLLAMA_BASE_URLS, DATA_DIR, UPLOAD_DIR
 from constants import ERROR_MESSAGES

@@ -75,7 +75,7 @@ async def download_file_stream(url, file_path, file_name, chunk_size=1024 * 1024
         hashed = calculate_sha256(file)
         file.seek(0)

-        url = f"{OLLAMA_API_BASE_URL}/blobs/sha256:{hashed}"
+        url = f"{OLLAMA_BASE_URLS[0]}/blobs/sha256:{hashed}"
         response = requests.post(url, data=file)

         if response.ok:
@@ -147,7 +147,7 @@ def upload(file: UploadFile = File(...)):
         hashed = calculate_sha256(f)
         f.seek(0)

-        url = f"{OLLAMA_API_BASE_URL}/blobs/sha256:{hashed}"
+        url = f"{OLLAMA_BASE_URLS[0]}/blobs/sha256:{hashed}"
         response = requests.post(url, data=f)

         if response.ok:
@@ -207,20 +207,25 @@ OLLAMA_API_BASE_URL = os.environ.get(
     "OLLAMA_API_BASE_URL", "http://localhost:11434/api"
 )

-if ENV == "prod":
-    if OLLAMA_API_BASE_URL == "/ollama/api":
-        OLLAMA_API_BASE_URL = "http://host.docker.internal:11434/api"
-
-
 OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "")

-if OLLAMA_BASE_URL == "":
+if ENV == "prod":
+    if OLLAMA_BASE_URL == "/ollama":
+        OLLAMA_BASE_URL = "http://host.docker.internal:11434"
+
+
+if OLLAMA_BASE_URL == "" and OLLAMA_API_BASE_URL != "":
     OLLAMA_BASE_URL = (
         OLLAMA_API_BASE_URL[:-4]
         if OLLAMA_API_BASE_URL.endswith("/api")
         else OLLAMA_API_BASE_URL
     )
+
+OLLAMA_BASE_URLS = os.environ.get("OLLAMA_BASE_URLS", "")
+OLLAMA_BASE_URLS = OLLAMA_BASE_URLS if OLLAMA_BASE_URLS != "" else OLLAMA_BASE_URL
+
+OLLAMA_BASE_URLS = [url.strip() for url in OLLAMA_BASE_URLS.split(",")]


 ####################################
 # OPENAI_API
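Read outside the diff, the new resolution order is: an explicit `OLLAMA_BASE_URL` wins, a legacy `OLLAMA_API_BASE_URL` is still honoured with its `/api` suffix stripped, and `OLLAMA_BASE_URLS` may carry a comma-separated list of backends that falls back to the single URL. Below is a condensed sketch of that logic for illustration only (it omits the prod-only `host.docker.internal` rewrite, and the helper name is hypothetical).

```python
# Condensed restatement of the resolution logic above, for illustration only.
import os


def resolve_ollama_urls() -> list[str]:
    legacy = os.environ.get("OLLAMA_API_BASE_URL", "")  # deprecated variable
    base_url = os.environ.get("OLLAMA_BASE_URL", "")

    # Fall back to the deprecated variable, stripping a trailing /api.
    if base_url == "" and legacy != "":
        base_url = legacy[:-4] if legacy.endswith("/api") else legacy

    # OLLAMA_BASE_URLS may list several backends, separated by commas.
    urls = os.environ.get("OLLAMA_BASE_URLS", "") or base_url
    return [url.strip() for url in urls.split(",")]


# Example: OLLAMA_BASE_URLS="http://ollama-a:11434,http://ollama-b:11434"
# yields ["http://ollama-a:11434", "http://ollama-b:11434"].
if __name__ == "__main__":
    print(resolve_ollama_urls())
```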
@@ -14,7 +14,7 @@ services:
     build:
       context: .
       args:
-        OLLAMA_API_BASE_URL: '/ollama/api'
+        OLLAMA_BASE_URL: '/ollama'
       dockerfile: Dockerfile
     image: ghcr.io/open-webui/open-webui:main
     container_name: open-webui
@@ -25,7 +25,7 @@ services:
     ports:
       - ${OPEN_WEBUI_PORT-3000}:8080
     environment:
-      - 'OLLAMA_API_BASE_URL=http://ollama:11434/api'
+      - 'OLLAMA_BASE_URL=http://ollama:11434'
       - 'WEBUI_SECRET_KEY='
     extra_hosts:
       - host.docker.internal:host-gateway
@@ -40,7 +40,7 @@ spec:
         - name: data
           mountPath: /app/backend/data
         env:
-        - name: OLLAMA_API_BASE_URL
+        - name: OLLAMA_BASE_URL
           value: {{ include "ollama.url" . | quote }}
         tty: true
      {{- with .Values.webui.nodeSelector }}
@@ -26,8 +26,8 @@ spec:
             cpu: "1000m"
             memory: "1Gi"
         env:
-        - name: OLLAMA_API_BASE_URL
-          value: "http://ollama-service.open-webui.svc.cluster.local:11434/api"
+        - name: OLLAMA_BASE_URL
+          value: "http://ollama-service.open-webui.svc.cluster.local:11434"
         tty: true
         volumeMounts:
         - name: webui-volume
@@ -1,6 +1,6 @@
 {
   "name": "open-webui",
-  "version": "0.1.108",
+  "version": "0.1.109",
   "private": true,
   "scripts": {
     "dev": "vite dev --host",
@@ -90,8 +90,3 @@ export const SUPPORTED_FILE_EXTENSIONS = [
 // This feature, akin to $env/static/private, exclusively incorporates environment variables
 // that are prefixed with config.kit.env.publicPrefix (usually set to PUBLIC_).
 // Consequently, these variables can be securely exposed to client-side code.
-
-// Example of the .env configuration:
-// OLLAMA_API_BASE_URL="http://localhost:11434/api"
-// # Public
-// PUBLIC_API_BASE_URL=$OLLAMA_API_BASE_URL