forked from open-webui/open-webui
Merge pull request #380 from ollama-webui/connection-url
feat: enable backend ollama url update
commit 31fcb9d6fc
14 changed files with 528 additions and 418 deletions
@@ -2,12 +2,6 @@
 
 FROM node:alpine as build
 
-ARG OLLAMA_API_BASE_URL='/ollama/api'
-RUN echo $OLLAMA_API_BASE_URL
-
-ENV PUBLIC_API_BASE_URL $OLLAMA_API_BASE_URL
-RUN echo $PUBLIC_API_BASE_URL
-
 WORKDIR /app
 
 COPY package.json package-lock.json ./

@@ -4,7 +4,7 @@
 
 The Ollama WebUI system is designed to streamline interactions between the client (your browser) and the Ollama API. At the heart of this design is a backend reverse proxy, enhancing security and resolving CORS issues.
 
-- **How it Works**: When you make a request (like `/ollama/api/tags`) from the Ollama WebUI, it doesn’t go directly to the Ollama API. Instead, it first reaches the Ollama WebUI backend. The backend then forwards this request to the Ollama API via the route you define in the `OLLAMA_API_BASE_URL` environment variable. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_API_BASE_URL/tags` in the backend.
+- **How it Works**: The Ollama WebUI is designed to interact with the Ollama API through a specific route. When a request is made from the WebUI to Ollama, it is not directly sent to the Ollama API. Initially, the request is sent to the Ollama WebUI backend via `/ollama/api` route. From there, the backend is responsible for forwarding the request to the Ollama API. This forwarding is accomplished by using the route specified in the `OLLAMA_API_BASE_URL` environment variable. Therefore, a request made to `/ollama/api` in the WebUI is effectively the same as making a request to `OLLAMA_API_BASE_URL` in the backend. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_API_BASE_URL/tags` in the backend.
 
 - **Security Benefits**: This design prevents direct exposure of the Ollama API to the frontend, safeguarding against potential CORS (Cross-Origin Resource Sharing) issues and unauthorized access. Requiring authentication to access the Ollama API further enhances this security layer.
 
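To make the request flow described in "How it Works" concrete, here is a brief editor-added sketch (not part of the diff itself); the helper name `to_ollama_url` and the example base URL are assumptions for illustration only.

    # Sketch of the proxy path mapping described above (illustrative only).
    OLLAMA_API_BASE_URL = "http://localhost:11434/api"  # example backend setting

    def to_ollama_url(webui_path: str) -> str:
        # "/ollama/api/tags" in the WebUI maps to "OLLAMA_API_BASE_URL/tags" in the backend
        prefix = "/ollama/api"
        assert webui_path.startswith(prefix)
        return OLLAMA_API_BASE_URL + webui_path[len(prefix):]

    print(to_ollama_url("/ollama/api/tags"))  # -> http://localhost:11434/api/tags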
@@ -27,6 +27,6 @@ docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BAS
 1. **Verify Ollama URL Format**:
    - When running the Web UI container, ensure the `OLLAMA_API_BASE_URL` is correctly set, including the `/api` suffix. (e.g., `http://192.168.1.1:11434/api` for different host setups).
    - In the Ollama WebUI, navigate to "Settings" > "General".
-   - Confirm that the Ollama Server URL is correctly set to `/ollama/api`, including the `/api` suffix.
+   - Confirm that the Ollama Server URL is correctly set to `[OLLAMA URL]/api` (e.g., `http://localhost:11434/api`), including the `/api` suffix.
 
 By following these enhanced troubleshooting steps, connection issues should be effectively resolved. For further assistance or queries, feel free to reach out to us on our community Discord.
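As a quick sanity check for the URL-format step above, the rule that the configured URL must carry the `/api` suffix can be expressed in a couple of lines (editor-added sketch, not part of the diff; the helper name is hypothetical):

    # Illustrative check for the "/api" suffix rule from the troubleshooting steps.
    def has_api_suffix(url: str) -> bool:
        return url.rstrip("/").endswith("/api")

    assert has_api_suffix("http://192.168.1.1:11434/api")   # correctly formed
    assert not has_api_suffix("http://192.168.1.1:11434")   # missing the /api suffix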
@@ -1,69 +1,68 @@
-from flask import Flask, request, Response, jsonify
-from flask_cors import CORS
+from fastapi import FastAPI, Request, Response, HTTPException, Depends
+from fastapi.middleware.cors import CORSMiddleware
+from fastapi.responses import StreamingResponse
 
 import requests
 import json
+from pydantic import BaseModel
 
 from apps.web.models.users import Users
 from constants import ERROR_MESSAGES
-from utils.utils import decode_token
+from utils.utils import decode_token, get_current_user
 from config import OLLAMA_API_BASE_URL, WEBUI_AUTH
 
-app = Flask(__name__)
-CORS(
-    app
-)  # Enable Cross-Origin Resource Sharing (CORS) to allow requests from different domains
+app = FastAPI()
+app.add_middleware(
+    CORSMiddleware,
+    allow_origins=["*"],
+    allow_credentials=True,
+    allow_methods=["*"],
+    allow_headers=["*"],
+)
 
-# Define the target server URL
-TARGET_SERVER_URL = OLLAMA_API_BASE_URL
+app.state.OLLAMA_API_BASE_URL = OLLAMA_API_BASE_URL
+
+# TARGET_SERVER_URL = OLLAMA_API_BASE_URL
 
 
-@app.route("/", defaults={"path": ""}, methods=["GET", "POST", "PUT", "DELETE"])
-@app.route("/<path:path>", methods=["GET", "POST", "PUT", "DELETE"])
-def proxy(path):
-    # Combine the base URL of the target server with the requested path
-    target_url = f"{TARGET_SERVER_URL}/{path}"
-    print(target_url)
+@app.get("/url")
+async def get_ollama_api_url(user=Depends(get_current_user)):
+    if user and user.role == "admin":
+        return {"OLLAMA_API_BASE_URL": app.state.OLLAMA_API_BASE_URL}
+    else:
+        raise HTTPException(status_code=401, detail=ERROR_MESSAGES.ACCESS_PROHIBITED)
 
-    # Get data from the original request
-    data = request.get_data()
+class UrlUpdateForm(BaseModel):
+    url: str
+
+
+@app.post("/url/update")
+async def update_ollama_api_url(
+    form_data: UrlUpdateForm, user=Depends(get_current_user)
+):
+    if user and user.role == "admin":
+        app.state.OLLAMA_API_BASE_URL = form_data.url
+        return {"OLLAMA_API_BASE_URL": app.state.OLLAMA_API_BASE_URL}
+    else:
+        raise HTTPException(status_code=401, detail=ERROR_MESSAGES.ACCESS_PROHIBITED)
 
 
+@app.api_route("/{path:path}", methods=["GET", "POST", "PUT", "DELETE"])
+async def proxy(path: str, request: Request, user=Depends(get_current_user)):
+    target_url = f"{app.state.OLLAMA_API_BASE_URL}/{path}"
+
+    body = await request.body()
     headers = dict(request.headers)
 
-    # Basic RBAC support
-    if WEBUI_AUTH:
-        if "Authorization" in headers:
-            _, credentials = headers["Authorization"].split()
-            token_data = decode_token(credentials)
-            if token_data is None or "email" not in token_data:
-                return jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}), 401
-
-            user = Users.get_user_by_email(token_data["email"])
-            if user:
-                # Only user and admin roles can access
-                if user.role in ["user", "admin"]:
-                    if path in ["pull", "delete", "push", "copy", "create"]:
-                        # Only admin role can perform actions above
-                        if user.role == "admin":
-                            pass
-                        else:
-                            return (
-                                jsonify({"detail": ERROR_MESSAGES.ACCESS_PROHIBITED}),
-                                401,
-                            )
-                else:
-                    return jsonify({"detail": ERROR_MESSAGES.ACCESS_PROHIBITED}), 401
-            else:
-                return jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}), 401
-        else:
-            return jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}), 401
-    else:
-        pass
+    if user.role in ["user", "admin"]:
+        if path in ["pull", "delete", "push", "copy", "create"]:
+            if user.role != "admin":
+                raise HTTPException(
+                    status_code=401, detail=ERROR_MESSAGES.ACCESS_PROHIBITED
+                )
+    else:
+        raise HTTPException(status_code=401, detail=ERROR_MESSAGES.ACCESS_PROHIBITED)
 
-    r = None
 
     headers.pop("Host", None)
     headers.pop("Authorization", None)
@@ -71,49 +70,30 @@ def proxy(path):
     headers.pop("Referer", None)
 
     try:
-        # Make a request to the target server
         r = requests.request(
             method=request.method,
             url=target_url,
-            data=data,
+            data=body,
             headers=headers,
-            stream=True,  # Enable streaming for server-sent events
+            stream=True,
         )
 
         r.raise_for_status()
 
-        # Proxy the target server's response to the client
-        def generate():
-            for chunk in r.iter_content(chunk_size=8192):
-                yield chunk
-
-        response = Response(generate(), status=r.status_code)
-
-        # Copy headers from the target server's response to the client's response
-        for key, value in r.headers.items():
-            response.headers[key] = value
-
-        return response
+        return StreamingResponse(
+            r.iter_content(chunk_size=8192),
+            status_code=r.status_code,
+            headers=dict(r.headers),
+        )
     except Exception as e:
         print(e)
         error_detail = "Ollama WebUI: Server Connection Error"
-        if r != None:
-            print(r.text)
-            res = r.json()
-            if "error" in res:
-                error_detail = f"Ollama: {res['error']}"
-            print(res)
+        if r is not None:
+            try:
+                res = r.json()
+                if "error" in res:
+                    error_detail = f"Ollama: {res['error']}"
+            except:
+                error_detail = f"Ollama: {e}"
 
-        return (
-            jsonify(
-                {
-                    "detail": error_detail,
-                    "message": str(e),
-                }
-            ),
-            400,
-        )
-
-
-if __name__ == "__main__":
-    app.run(debug=True)
+        raise HTTPException(status_code=r.status_code, detail=error_detail)

backend/apps/ollama/old_main.py (new file, 176 lines)
@@ -0,0 +1,176 @@
+from flask import Flask, request, Response, jsonify
+from flask_cors import CORS
+
+
+import requests
+import json
+
+
+from apps.web.models.users import Users
+from constants import ERROR_MESSAGES
+from utils.utils import decode_token
+from config import OLLAMA_API_BASE_URL, WEBUI_AUTH
+
+app = Flask(__name__)
+CORS(
+    app
+)  # Enable Cross-Origin Resource Sharing (CORS) to allow requests from different domains
+
+# Define the target server URL
+TARGET_SERVER_URL = OLLAMA_API_BASE_URL
+
+
+@app.route("/url", methods=["GET"])
+def get_ollama_api_url():
+    headers = dict(request.headers)
+    if "Authorization" in headers:
+        _, credentials = headers["Authorization"].split()
+        token_data = decode_token(credentials)
+        if token_data is None or "email" not in token_data:
+            return jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}), 401
+
+        user = Users.get_user_by_email(token_data["email"])
+        if user and user.role == "admin":
+            return (
+                jsonify({"OLLAMA_API_BASE_URL": TARGET_SERVER_URL}),
+                200,
+            )
+        else:
+            return (
+                jsonify({"detail": ERROR_MESSAGES.ACCESS_PROHIBITED}),
+                401,
+            )
+    else:
+        return (
+            jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}),
+            401,
+        )
+
+
+@app.route("/url/update", methods=["POST"])
+def update_ollama_api_url():
+    headers = dict(request.headers)
+    data = request.get_json(force=True)
+
+    if "Authorization" in headers:
+        _, credentials = headers["Authorization"].split()
+        token_data = decode_token(credentials)
+        if token_data is None or "email" not in token_data:
+            return jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}), 401
+
+        user = Users.get_user_by_email(token_data["email"])
+        if user and user.role == "admin":
+            TARGET_SERVER_URL = data["url"]
+            return (
+                jsonify({"OLLAMA_API_BASE_URL": TARGET_SERVER_URL}),
+                200,
+            )
+        else:
+            return (
+                jsonify({"detail": ERROR_MESSAGES.ACCESS_PROHIBITED}),
+                401,
+            )
+    else:
+        return (
+            jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}),
+            401,
+        )
+
+
+@app.route("/", defaults={"path": ""}, methods=["GET", "POST", "PUT", "DELETE"])
+@app.route("/<path:path>", methods=["GET", "POST", "PUT", "DELETE"])
+def proxy(path):
+    # Combine the base URL of the target server with the requested path
+    target_url = f"{TARGET_SERVER_URL}/{path}"
+    print(target_url)
+
+    # Get data from the original request
+    data = request.get_data()
+    headers = dict(request.headers)
+
+    # Basic RBAC support
+    if WEBUI_AUTH:
+        if "Authorization" in headers:
+            _, credentials = headers["Authorization"].split()
+            token_data = decode_token(credentials)
+            if token_data is None or "email" not in token_data:
+                return jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}), 401
+
+            user = Users.get_user_by_email(token_data["email"])
+            if user:
+                # Only user and admin roles can access
+                if user.role in ["user", "admin"]:
+                    if path in ["pull", "delete", "push", "copy", "create"]:
+                        # Only admin role can perform actions above
+                        if user.role == "admin":
+                            pass
+                        else:
+                            return (
+                                jsonify({"detail": ERROR_MESSAGES.ACCESS_PROHIBITED}),
+                                401,
+                            )
+                else:
+                    return jsonify({"detail": ERROR_MESSAGES.ACCESS_PROHIBITED}), 401
+            else:
+                return jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}), 401
+        else:
+            return jsonify({"detail": ERROR_MESSAGES.UNAUTHORIZED}), 401
+    else:
+        pass
+
+    r = None
+
+    headers.pop("Host", None)
+    headers.pop("Authorization", None)
+    headers.pop("Origin", None)
+    headers.pop("Referer", None)
+
+    try:
+        # Make a request to the target server
+        r = requests.request(
+            method=request.method,
+            url=target_url,
+            data=data,
+            headers=headers,
+            stream=True,  # Enable streaming for server-sent events
+        )
+
+        r.raise_for_status()
+
+        # Proxy the target server's response to the client
+        def generate():
+            for chunk in r.iter_content(chunk_size=8192):
+                yield chunk
+
+        response = Response(generate(), status=r.status_code)
+
+        # Copy headers from the target server's response to the client's response
+        for key, value in r.headers.items():
+            response.headers[key] = value
+
+        return response
+    except Exception as e:
+        print(e)
+        error_detail = "Ollama WebUI: Server Connection Error"
+        if r != None:
+            print(r.text)
+            res = r.json()
+            if "error" in res:
+                error_detail = f"Ollama: {res['error']}"
+            print(res)
+
+        return (
+            jsonify(
+                {
+                    "detail": error_detail,
+                    "message": str(e),
+                }
+            ),
+            400,
+        )
+
+
+if __name__ == "__main__":
+    app.run(debug=True)

@@ -46,5 +46,7 @@ async def check_url(request: Request, call_next):
 
 
 app.mount("/api/v1", webui_app)
-app.mount("/ollama/api", WSGIMiddleware(ollama_app))
+# app.mount("/ollama/api", WSGIMiddleware(ollama_app))
+app.mount("/ollama/api", ollama_app)
 
 app.mount("/", SPAStaticFiles(directory="../build", html=True), name="spa-static-files")

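For orientation, here is an editor-added sketch (not part of the diff) of how the new admin-only `/url` and `/url/update` endpoints could be exercised once the Ollama proxy app is mounted at `/ollama/api` as in the hunk above; the backend address and the token value are assumptions.

    # Illustrative client for the new endpoints; assumes the WebUI backend listens on
    # http://localhost:8080 and ADMIN_TOKEN holds a valid admin JWT issued by the WebUI.
    import requests

    BASE = "http://localhost:8080/ollama/api"
    ADMIN_TOKEN = "<admin JWT>"  # placeholder
    headers = {"Authorization": f"Bearer {ADMIN_TOKEN}"}

    # Read the currently configured Ollama API base URL
    print(requests.get(f"{BASE}/url", headers=headers).json())

    # Point the backend at a different Ollama instance at runtime
    print(
        requests.post(
            f"{BASE}/url/update",
            json={"url": "http://192.168.1.1:11434/api"},
            headers=headers,
        ).json()
    )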
@@ -1,12 +1,76 @@
 import { OLLAMA_API_BASE_URL } from '$lib/constants';
 
-export const getOllamaVersion = async (
-  base_url: string = OLLAMA_API_BASE_URL,
-  token: string = ''
-) => {
+export const getOllamaAPIUrl = async (token: string = '') => {
   let error = null;
 
-  const res = await fetch(`${base_url}/version`, {
+  const res = await fetch(`${OLLAMA_API_BASE_URL}/url`, {
+    method: 'GET',
+    headers: {
+      Accept: 'application/json',
+      'Content-Type': 'application/json',
+      ...(token && { authorization: `Bearer ${token}` })
+    }
+  })
+    .then(async (res) => {
+      if (!res.ok) throw await res.json();
+      return res.json();
+    })
+    .catch((err) => {
+      console.log(err);
+      if ('detail' in err) {
+        error = err.detail;
+      } else {
+        error = 'Server connection failed';
+      }
+      return null;
+    });
+
+  if (error) {
+    throw error;
+  }
+
+  return res.OLLAMA_API_BASE_URL;
+};
+
+export const updateOllamaAPIUrl = async (token: string = '', url: string) => {
+  let error = null;
+
+  const res = await fetch(`${OLLAMA_API_BASE_URL}/url/update`, {
+    method: 'POST',
+    headers: {
+      Accept: 'application/json',
+      'Content-Type': 'application/json',
+      ...(token && { authorization: `Bearer ${token}` })
+    },
+    body: JSON.stringify({
+      url: url
+    })
+  })
+    .then(async (res) => {
+      if (!res.ok) throw await res.json();
+      return res.json();
+    })
+    .catch((err) => {
+      console.log(err);
+      if ('detail' in err) {
+        error = err.detail;
+      } else {
+        error = 'Server connection failed';
+      }
+      return null;
+    });
+
+  if (error) {
+    throw error;
+  }
+
+  return res.OLLAMA_API_BASE_URL;
+};
+
+export const getOllamaVersion = async (token: string = '') => {
+  let error = null;
+
+  const res = await fetch(`${OLLAMA_API_BASE_URL}/version`, {
     method: 'GET',
     headers: {
       Accept: 'application/json',
@@ -35,13 +99,10 @@ export const getOllamaVersion = async (
   return res?.version ?? '';
 };
 
-export const getOllamaModels = async (
-  base_url: string = OLLAMA_API_BASE_URL,
-  token: string = ''
-) => {
+export const getOllamaModels = async (token: string = '') => {
   let error = null;
 
-  const res = await fetch(`${base_url}/tags`, {
+  const res = await fetch(`${OLLAMA_API_BASE_URL}/tags`, {
     method: 'GET',
     headers: {
       Accept: 'application/json',
@@ -72,15 +133,10 @@ export const getOllamaModels = async (
   });
 };
 
-export const generateTitle = async (
-  base_url: string = OLLAMA_API_BASE_URL,
-  token: string = '',
-  model: string,
-  prompt: string
-) => {
+export const generateTitle = async (token: string = '', model: string, prompt: string) => {
   let error = null;
 
-  const res = await fetch(`${base_url}/generate`, {
+  const res = await fetch(`${OLLAMA_API_BASE_URL}/generate`, {
     method: 'POST',
     headers: {
       'Content-Type': 'text/event-stream',
@@ -111,14 +167,10 @@ export const generateTitle = async (
   return res?.response ?? 'New Chat';
 };
 
-export const generateChatCompletion = async (
-  base_url: string = OLLAMA_API_BASE_URL,
-  token: string = '',
-  body: object
-) => {
+export const generateChatCompletion = async (token: string = '', body: object) => {
   let error = null;
 
-  const res = await fetch(`${base_url}/chat`, {
+  const res = await fetch(`${OLLAMA_API_BASE_URL}/chat`, {
     method: 'POST',
     headers: {
       'Content-Type': 'text/event-stream',
@@ -137,15 +189,10 @@ export const generateChatCompletion = async (
   return res;
 };
 
-export const createModel = async (
-  base_url: string = OLLAMA_API_BASE_URL,
-  token: string,
-  tagName: string,
-  content: string
-) => {
+export const createModel = async (token: string, tagName: string, content: string) => {
   let error = null;
 
-  const res = await fetch(`${base_url}/create`, {
+  const res = await fetch(`${OLLAMA_API_BASE_URL}/create`, {
     method: 'POST',
     headers: {
       'Content-Type': 'text/event-stream',
@@ -167,14 +214,10 @@ export const createModel = async (
   return res;
 };
 
-export const deleteModel = async (
-  base_url: string = OLLAMA_API_BASE_URL,
-  token: string,
-  tagName: string
-) => {
+export const deleteModel = async (token: string, tagName: string) => {
   let error = null;
 
-  const res = await fetch(`${base_url}/delete`, {
+  const res = await fetch(`${OLLAMA_API_BASE_URL}/delete`, {
     method: 'DELETE',
     headers: {
       'Content-Type': 'text/event-stream',
@@ -204,3 +247,27 @@ export const deleteModel = async (
 
   return res;
 };
+
+export const pullModel = async (token: string, tagName: string) => {
+  let error = null;
+
+  const res = await fetch(`${OLLAMA_API_BASE_URL}/pull`, {
+    method: 'POST',
+    headers: {
+      'Content-Type': 'text/event-stream',
+      Authorization: `Bearer ${token}`
+    },
+    body: JSON.stringify({
+      name: tagName
+    })
+  }).catch((err) => {
+    error = err;
+    return null;
+  });
+
+  if (error) {
+    throw error;
+  }
+
+  return res;
+};

@@ -7,19 +7,23 @@
   import { config, models, settings, user, chats } from '$lib/stores';
   import { splitStream, getGravatarURL } from '$lib/utils';
 
-  import { getOllamaVersion, getOllamaModels } from '$lib/apis/ollama';
-  import { createNewChat, deleteAllChats, getAllChats, getChatList } from '$lib/apis/chats';
   import {
-    WEB_UI_VERSION,
-    OLLAMA_API_BASE_URL,
-    WEBUI_API_BASE_URL,
-    WEBUI_BASE_URL
-  } from '$lib/constants';
+    getOllamaVersion,
+    getOllamaModels,
+    getOllamaAPIUrl,
+    updateOllamaAPIUrl,
+    pullModel,
+    createModel,
+    deleteModel
+  } from '$lib/apis/ollama';
+  import { createNewChat, deleteAllChats, getAllChats, getChatList } from '$lib/apis/chats';
+  import { WEB_UI_VERSION, WEBUI_API_BASE_URL } from '$lib/constants';
 
   import Advanced from './Settings/Advanced.svelte';
   import Modal from '../common/Modal.svelte';
   import { updateUserPassword } from '$lib/apis/auths';
   import { goto } from '$app/navigation';
+  import Page from '../../../routes/(app)/+page.svelte';
 
   export let show = false;
@@ -33,7 +37,7 @@
   let selectedTab = 'general';
 
   // General
-  let API_BASE_URL = OLLAMA_API_BASE_URL;
+  let API_BASE_URL = '';
   let themes = ['dark', 'light', 'rose-pine dark', 'rose-pine-dawn light'];
   let theme = 'dark';
   let notificationEnabled = false;
@@ -139,19 +143,13 @@
   // About
   let ollamaVersion = '';
 
-  const checkOllamaConnection = async () => {
-    if (API_BASE_URL === '') {
-      API_BASE_URL = OLLAMA_API_BASE_URL;
-    }
-    const _models = await getModels(API_BASE_URL, 'ollama');
+  const updateOllamaAPIUrlHandler = async () => {
+    API_BASE_URL = await updateOllamaAPIUrl(localStorage.token, API_BASE_URL);
+    const _models = await getModels('ollama');
 
     if (_models.length > 0) {
       toast.success('Server connection verified');
       await models.set(_models);
-
-      saveSettings({
-        API_BASE_URL: API_BASE_URL
-      });
     }
   };
 
@@ -229,18 +227,10 @@
 
   const pullModelHandler = async () => {
     modelTransferring = true;
-    const res = await fetch(`${API_BASE_URL}/pull`, {
-      method: 'POST',
-      headers: {
-        'Content-Type': 'text/event-stream',
-        ...($settings.authHeader && { Authorization: $settings.authHeader }),
-        ...($user && { Authorization: `Bearer ${localStorage.token}` })
-      },
-      body: JSON.stringify({
-        name: modelTag
-      })
-    });
 
+    const res = await pullModel(localStorage.token, modelTag);
+
+    if (res) {
       const reader = res.body
         .pipeThrough(new TextDecoderStream())
         .pipeThrough(splitStream('\n'))
@@ -292,6 +282,7 @@
             toast.error(error);
           }
         }
+      }
 
     modelTag = '';
     modelTransferring = false;
@@ -410,21 +401,11 @@
       }
 
       if (uploaded) {
-        const res = await fetch(`${$settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL}/create`, {
-          method: 'POST',
-          headers: {
-            'Content-Type': 'text/event-stream',
-            ...($settings.authHeader && { Authorization: $settings.authHeader }),
-            ...($user && { Authorization: `Bearer ${localStorage.token}` })
-          },
-          body: JSON.stringify({
-            name: `${name}:latest`,
-            modelfile: `FROM @${modelFileDigest}\n${modelFileContent}`
-          })
-        }).catch((err) => {
-          console.log(err);
-          return null;
-        });
+        const res = await createModel(
+          localStorage.token,
+          `${name}:latest`,
+          `FROM @${modelFileDigest}\n${modelFileContent}`
+        );
 
         if (res && res.ok) {
           const reader = res.body
@@ -490,66 +471,22 @@
 
   };
 
   const deleteModelHandler = async () => {
-    const res = await fetch(`${API_BASE_URL}/delete`, {
-      method: 'DELETE',
-      headers: {
-        'Content-Type': 'text/event-stream',
-        ...($settings.authHeader && { Authorization: $settings.authHeader }),
-        ...($user && { Authorization: `Bearer ${localStorage.token}` })
-      },
-      body: JSON.stringify({
-        name: deleteModelTag
-      })
+    const res = await deleteModel(localStorage.token, deleteModelTag).catch((error) => {
+      toast.error(error);
     });
 
-    const reader = res.body
-      .pipeThrough(new TextDecoderStream())
-      .pipeThrough(splitStream('\n'))
-      .getReader();
-
-    while (true) {
-      const { value, done } = await reader.read();
-      if (done) break;
-
-      try {
-        let lines = value.split('\n');
-
-        for (const line of lines) {
-          if (line !== '' && line !== 'null') {
-            console.log(line);
-            let data = JSON.parse(line);
-            console.log(data);
-
-            if (data.error) {
-              throw data.error;
-            }
-            if (data.detail) {
-              throw data.detail;
-            }
-
-            if (data.status) {
-            }
-          } else {
-            toast.success(`Deleted ${deleteModelTag}`);
-          }
-        }
-      } catch (error) {
-        console.log(error);
-        toast.error(error);
-      }
-    }
+    if (res) {
+      toast.success(`Deleted ${deleteModelTag}`);
+    }
 
     deleteModelTag = '';
     models.set(await getModels());
   };
 
-  const getModels = async (url = '', type = 'all') => {
+  const getModels = async (type = 'all') => {
     let models = [];
     models.push(
-      ...(await getOllamaModels(
-        url ? url : $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
-        localStorage.token
-      ).catch((error) => {
+      ...(await getOllamaModels(localStorage.token).catch((error) => {
        toast.error(error);
        return [];
      }))
@@ -557,10 +494,10 @@
 
     // If OpenAI API Key exists
     if (type === 'all' && $settings.OPENAI_API_KEY) {
-      const API_BASE_URL = $settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1';
+      const OPENAI_API_BASE_URL = $settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1';
 
       // Validate OPENAI_API_KEY
-      const openaiModelRes = await fetch(`${API_BASE_URL}/models`, {
+      const openaiModelRes = await fetch(`${OPENAI_API_BASE_URL}/models`, {
         method: 'GET',
         headers: {
           'Content-Type': 'application/json',
@@ -588,7 +525,7 @@
             ...openAIModels
               .map((model) => ({ name: model.id, external: true }))
               .filter((model) =>
-                API_BASE_URL.includes('openai') ? model.name.includes('gpt') : true
+                OPENAI_API_BASE_URL.includes('openai') ? model.name.includes('gpt') : true
               )
           ]
         : [])
@@ -624,15 +561,18 @@
   };
 
   onMount(async () => {
+    console.log('settings', $user.role === 'admin');
+    if ($user.role === 'admin') {
+      API_BASE_URL = await getOllamaAPIUrl(localStorage.token);
+    }
+
     let settings = JSON.parse(localStorage.getItem('settings') ?? '{}');
     console.log(settings);
 
     theme = localStorage.theme ?? 'dark';
     notificationEnabled = settings.notificationEnabled ?? false;
 
-    API_BASE_URL = settings.API_BASE_URL ?? OLLAMA_API_BASE_URL;
     system = settings.system ?? '';
 
     requestFormat = settings.requestFormat ?? '';
 
     options.seed = settings.seed ?? 0;
@@ -659,10 +599,7 @@
       authContent = settings.authHeader.split(' ')[1];
     }
 
-    ollamaVersion = await getOllamaVersion(
-      API_BASE_URL ?? OLLAMA_API_BASE_URL,
-      localStorage.token
-    ).catch((error) => {
+    ollamaVersion = await getOllamaVersion(localStorage.token).catch((error) => {
       return '';
     });
   });
@@ -1026,6 +963,7 @@
           </div>
         </div>
 
+        {#if $user.role === 'admin'}
           <hr class=" dark:border-gray-700" />
           <div>
             <div class=" mb-2.5 text-sm font-medium">Ollama API URL</div>
@@ -1033,14 +971,14 @@
               <div class="flex-1 mr-2">
                 <input
                   class="w-full rounded py-2 px-4 text-sm dark:text-gray-300 dark:bg-gray-800 outline-none"
-                  placeholder="Enter URL (e.g. http://localhost:8080/ollama/api)"
+                  placeholder="Enter URL (e.g. http://localhost:11434/api)"
                   bind:value={API_BASE_URL}
                 />
               </div>
               <button
                 class="px-3 bg-gray-200 hover:bg-gray-300 dark:bg-gray-600 dark:hover:bg-gray-700 rounded transition"
                 on:click={() => {
-                  checkOllamaConnection();
+                  updateOllamaAPIUrlHandler();
                 }}
               >
                 <svg
@@ -1059,11 +997,9 @@
             </div>
 
             <div class="mt-2 text-xs text-gray-400 dark:text-gray-500">
-              The field above should be set to <span
-                class=" text-gray-500 dark:text-gray-300 font-medium">'/ollama/api'</span
-              >;
+              Trouble accessing Ollama?
               <a
-                class=" text-gray-500 dark:text-gray-300 font-medium"
+                class=" text-gray-300 font-medium"
                 href="https://github.com/ollama-webui/ollama-webui#troubleshooting"
                 target="_blank"
               >
@@ -1071,6 +1007,7 @@
               </a>
             </div>
           </div>
+        {/if}
 
         <hr class=" dark:border-gray-700" />
 
@@ -1088,7 +1025,6 @@
             class=" px-4 py-2 bg-emerald-600 hover:bg-emerald-700 text-gray-100 transition rounded"
             on:click={() => {
               saveSettings({
-                API_BASE_URL: API_BASE_URL === '' ? OLLAMA_API_BASE_URL : API_BASE_URL,
                 system: system !== '' ? system : undefined
               });
               show = false;

@@ -1,13 +1,8 @@
-import { dev, browser } from '$app/environment';
-import { PUBLIC_API_BASE_URL } from '$env/static/public';
+import { dev } from '$app/environment';
 
 export const OLLAMA_API_BASE_URL = dev
   ? `http://${location.hostname}:8080/ollama/api`
-  : PUBLIC_API_BASE_URL === ''
-  ? browser
-    ? `http://${location.hostname}:11434/api`
-    : `http://localhost:11434/api`
-  : PUBLIC_API_BASE_URL;
+  : '/ollama/api';
 
 export const WEBUI_BASE_URL = dev ? `http://${location.hostname}:8080` : ``;
 export const WEBUI_API_BASE_URL = `${WEBUI_BASE_URL}/api/v1`;

@@ -14,7 +14,7 @@
   import { getOpenAIModels } from '$lib/apis/openai';
 
   import { user, showSettings, settings, models, modelfiles, prompts } from '$lib/stores';
-  import { OLLAMA_API_BASE_URL, REQUIRED_OLLAMA_VERSION, WEBUI_API_BASE_URL } from '$lib/constants';
+  import { REQUIRED_OLLAMA_VERSION, WEBUI_API_BASE_URL } from '$lib/constants';
 
   import SettingsModal from '$lib/components/chat/SettingsModal.svelte';
   import Sidebar from '$lib/components/layout/Sidebar.svelte';
@@ -32,10 +32,7 @@
   const getModels = async () => {
     let models = [];
     models.push(
-      ...(await getOllamaModels(
-        $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
-        localStorage.token
-      ).catch((error) => {
+      ...(await getOllamaModels(localStorage.token).catch((error) => {
         toast.error(error);
         return [];
       }))
@@ -58,10 +55,7 @@
 
   const setOllamaVersion = async (version: string = '') => {
     if (version === '') {
-      version = await getOllamaVersion(
-        $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
-        localStorage.token
-      ).catch((error) => {
+      version = await getOllamaVersion(localStorage.token).catch((error) => {
         return '';
       });
     }

@@ -7,7 +7,6 @@
   import { page } from '$app/stores';
 
   import { models, modelfiles, user, settings, chats, chatId, config } from '$lib/stores';
-  import { OLLAMA_API_BASE_URL } from '$lib/constants';
 
   import { generateChatCompletion, generateTitle } from '$lib/apis/ollama';
   import { copyToClipboard, splitStream } from '$lib/utils';
@@ -163,10 +162,7 @@
     // Scroll down
     window.scrollTo({ top: document.body.scrollHeight });
 
-    const res = await generateChatCompletion(
-      $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
-      localStorage.token,
-      {
+    const res = await generateChatCompletion(localStorage.token, {
       model: model,
       messages: [
         $settings.system
@@ -191,8 +187,7 @@
         ...($settings.options ?? {})
       },
       format: $settings.requestFormat ?? undefined
-    }
-    );
+    });
 
     if (res && res.ok) {
       const reader = res.body
@@ -595,7 +590,6 @@
   const generateChatTitle = async (_chatId, userPrompt) => {
     if ($settings.titleAutoGenerate ?? true) {
       const title = await generateTitle(
-        $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
         localStorage.token,
         $settings?.titleAutoGenerateModel ?? selectedModels[0],
         userPrompt

@@ -7,7 +7,6 @@
   import { page } from '$app/stores';
 
   import { models, modelfiles, user, settings, chats, chatId } from '$lib/stores';
-  import { OLLAMA_API_BASE_URL } from '$lib/constants';
 
   import { generateChatCompletion, generateTitle } from '$lib/apis/ollama';
   import { copyToClipboard, splitStream } from '$lib/utils';
@@ -180,10 +179,7 @@
     // Scroll down
     window.scrollTo({ top: document.body.scrollHeight });
 
-    const res = await generateChatCompletion(
-      $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
-      localStorage.token,
-      {
+    const res = await generateChatCompletion(localStorage.token, {
       model: model,
       messages: [
         $settings.system
@@ -208,8 +204,7 @@
         ...($settings.options ?? {})
       },
       format: $settings.requestFormat ?? undefined
-    }
-    );
+    });
 
     if (res && res.ok) {
       const reader = res.body
@@ -611,12 +606,7 @@
 
   const generateChatTitle = async (_chatId, userPrompt) => {
     if ($settings.titleAutoGenerate ?? true) {
-      const title = await generateTitle(
-        $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
-        localStorage.token,
-        selectedModels[0],
-        userPrompt
-      );
+      const title = await generateTitle(localStorage.token, selectedModels[0], userPrompt);
 
       if (title) {
         await setChatTitle(_chatId, title);

@@ -6,7 +6,6 @@
   import { onMount } from 'svelte';
 
   import { modelfiles, settings, user } from '$lib/stores';
-  import { OLLAMA_API_BASE_URL } from '$lib/constants';
   import { createModel, deleteModel } from '$lib/apis/ollama';
   import {
     createNewModelfile,
@@ -20,11 +19,7 @@
   const deleteModelHandler = async (tagName) => {
     let success = null;
 
-    success = await deleteModel(
-      $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
-      localStorage.token,
-      tagName
-    );
+    success = await deleteModel(localStorage.token, tagName);
 
     if (success) {
       toast.success(`Deleted ${tagName}`);

@@ -2,7 +2,6 @@
   import { v4 as uuidv4 } from 'uuid';
   import { toast } from 'svelte-french-toast';
   import { goto } from '$app/navigation';
-  import { OLLAMA_API_BASE_URL } from '$lib/constants';
   import { settings, user, config, modelfiles, models } from '$lib/stores';
 
   import Advanced from '$lib/components/chat/Settings/Advanced.svelte';
@@ -132,12 +131,7 @@ SYSTEM """${system}"""`.replace(/^\s*\n/gm, '');
       Object.keys(categories).filter((category) => categories[category]).length > 0 &&
       !$models.includes(tagName)
     ) {
-      const res = await createModel(
-        $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
-        localStorage.token,
-        tagName,
-        content
-      );
+      const res = await createModel(localStorage.token, tagName, content);
 
       if (res) {
         const reader = res.body

@@ -7,8 +7,6 @@
   import { page } from '$app/stores';
 
   import { settings, user, config, modelfiles } from '$lib/stores';
-
-  import { OLLAMA_API_BASE_URL } from '$lib/constants';
   import { splitStream } from '$lib/utils';
 
   import { createModel } from '$lib/apis/ollama';
@@ -104,12 +102,7 @@
       content !== '' &&
       Object.keys(categories).filter((category) => categories[category]).length > 0
     ) {
-      const res = await createModel(
-        $settings?.API_BASE_URL ?? OLLAMA_API_BASE_URL,
-        localStorage.token,
-        tagName,
-        content
-      );
+      const res = await createModel(localStorage.token, tagName, content);
 
       if (res) {
         const reader = res.body