From 6fb5a37170264fe4ddaef1679fd695fed3371dc8 Mon Sep 17 00:00:00 2001 From: Noah Date: Tue, 19 Dec 2023 20:19:43 +0100 Subject: [PATCH 01/20] Add WSL Ubuntu (webui docker + ollama separately) to TROUBLESHOOTING.md --- TROUBLESHOOTING.md | 9 +++++++++ 1 file changed, 9 insertions(+) diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md index d0d8ce2d..9a29d828 100644 --- a/TROUBLESHOOTING.md +++ b/TROUBLESHOOTING.md @@ -45,6 +45,15 @@ Becomes docker run --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://example.com:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main ``` +## Running ollama-webui as a container on WSL Ubuntu +If you're running ollama-webui via docker on WSL Ubuntu and have chosen to install webui and ollama separately, you might encounter connection issues. This is often due to the docker container being unable to reach the Ollama server at 127.0.0.1:11434. To resolve this, you can use the `--network=host` flag in the docker command. + +Here's an example of the command you should run: + +```bash +sudo docker run -d --network=host -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main +``` + ## References [Change Docker Desktop Settings on Mac](https://docs.docker.com/desktop/settings/mac/) Search for "x86" in that page. 
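One side effect worth noting: the host-networking command above drops the `-p 3000:8080` mapping used elsewhere in this guide, so the UI is served on the container's internal port (8080) rather than the mapped host port (3000). A tiny illustrative sketch of the resulting URL — the helper name is ours, not part of the project:

```javascript
// Illustrative only: which localhost URL serves the WebUI.
// With -p 3000:8080 the mapped host port (3000) applies;
// with --network=host there is no mapping, so the internal port (8080) is used.
const webuiUrl = (hostNetwork) => `http://localhost:${hostNetwork ? 8080 : 3000}`;
```

In other words, after switching to `--network=host`, browse to http://localhost:8080 instead of http://localhost:3000.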
From 30d3f6897d03027bed7ce2d729d967a3503cc3fd Mon Sep 17 00:00:00 2001 From: Timothy Jaeryang Baek Date: Tue, 19 Dec 2023 14:37:36 -0500 Subject: [PATCH 02/20] doc: backend dependency warning added --- README.md | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/README.md b/README.md index c5919688..6a652207 100644 --- a/README.md +++ b/README.md @@ -123,6 +123,10 @@ docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name While we strongly recommend using our convenient Docker container installation for optimal support, we understand that some situations may require a non-Docker setup, especially for development purposes. Please note that non-Docker installations are not officially supported, and you might need to troubleshoot on your own. +**Warning: Backend Dependency for Proper Functionality** + +The application requires both the backend and frontend to run simultaneously; serving only the frontend in isolation is not supported and may leave the application inoperable. Issues raised for frontend-only setups will not be addressed, as they fall outside the intended usage. Please follow the steps in this documentation: use the frontend solely to build static files, then run the complete application with the provided backend commands. Configurations that deviate from these instructions are unsupported, and we may not be able to provide assistance with them. + ### TL;DR 🚀 Run the following commands to install: From 6a38bdcf77b41f8a1318f2406439cd3236dc37c2 Mon Sep 17 00:00:00 2001 From: "Timothy J. 
Baek" Date: Tue, 19 Dec 2023 11:57:49 -0800 Subject: [PATCH 03/20] fix: better error message for connection issue --- src/routes/(app)/+layout.svelte | 57 +++++++++++++++++---------------- 1 file changed, 30 insertions(+), 27 deletions(-) diff --git a/src/routes/(app)/+layout.svelte b/src/routes/(app)/+layout.svelte index 4f477423..fe523d93 100644 --- a/src/routes/(app)/+layout.svelte +++ b/src/routes/(app)/+layout.svelte @@ -236,36 +236,39 @@
-
-
- Ollama Update Required -
+
+
+
+ Connection Issue or Update Needed +
-
- Oops! It seems like your Ollama needs a little attention. - We encountered a connection issue or noticed that you're running an outdated version. Please - update to - {requiredOllamaVersion} or above. -
+
+ Oops! It seems like your Ollama needs a little attention. We've detected either a connection hiccup or observed that you're using an older + version. Ensure you're on the latest Ollama version + (version + {requiredOllamaVersion} or higher) + or check your connection. +
-
- +
+ - + +
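The banner in this patch fires on either a failed connection or an outdated server, so the underlying check has to treat both cases the same way. A rough sketch of such a gate, assuming a dotted numeric version string like the `{requiredOllamaVersion}` referenced above — this helper is illustrative, not the project's actual implementation:

```javascript
// Sketch (not the project's code): decide whether the attention banner shows.
// A null version means the version fetch failed, i.e. a connection issue;
// otherwise compare dotted version components numerically.
const needsAttention = (current, required) => {
  if (current === null) return true; // connection issue
  const parse = (v) => v.split('.').map(Number);
  const [cur, req] = [parse(current), parse(required)];
  for (let i = 0; i < Math.max(cur.length, req.length); i++) {
    const diff = (cur[i] ?? 0) - (req[i] ?? 0);
    if (diff !== 0) return diff < 0; // outdated if any component falls short
  }
  return false; // meets the required version
};
```

Collapsing both failure modes into one predicate is what lets the UI show a single "Connection Issue or Update Needed" message instead of two separate dialogs.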
From 1d85f1ce831e10088386cbd1c328245ce1a0a359 Mon Sep 17 00:00:00 2001 From: Timothy Jaeryang Baek Date: Tue, 19 Dec 2023 15:36:32 -0500 Subject: [PATCH 04/20] doc: disclaimer added --- README.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/README.md b/README.md index 6a652207..93cafc00 100644 --- a/README.md +++ b/README.md @@ -13,6 +13,8 @@ ChatGPT-Style Web Interface for Ollama 🦙 +**Disclaimer:** *ollama-webui is a community-driven project and is not affiliated with the Ollama team in any way. This initiative is independent, and any inquiries or feedback should be directed to [our community on Discord](https://discord.gg/5rJgQTnV4s). We kindly request users to refrain from contacting or harassing the Ollama team regarding this project.* + ![Ollama Web UI Demo](./demo.gif) Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you can discover, download, and explore customized Modelfiles for Ollama! 🦙🔍 From 9f674aed4b94a70b505654577efa021110996cdc Mon Sep 17 00:00:00 2001 From: Timothy Jaeryang Baek Date: Tue, 19 Dec 2023 15:41:05 -0500 Subject: [PATCH 05/20] Update TROUBLESHOOTING.md --- TROUBLESHOOTING.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md index 9a29d828..024f4732 100644 --- a/TROUBLESHOOTING.md +++ b/TROUBLESHOOTING.md @@ -51,7 +51,7 @@ If you're running ollama-webui via docker on WSL Ubuntu and have chosen to insta Here's an example of the command you should run: ```bash -sudo docker run -d --network=host -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main +docker run -d --network=host -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main ``` ## References From 3bcf440503c2fca8ac8e3aea5c9a07d447ebb445 Mon Sep 17 00:00:00 2001 From: "Timothy J. 
Baek" Date: Tue, 19 Dec 2023 14:50:43 -0800 Subject: [PATCH 06/20] feat: file drag and drop support --- src/lib/components/chat/MessageInput.svelte | 64 +++++++++++++++++++++ 1 file changed, 64 insertions(+) diff --git a/src/lib/components/chat/MessageInput.svelte b/src/lib/components/chat/MessageInput.svelte index 172485ca..d83a5fcd 100644 --- a/src/lib/components/chat/MessageInput.svelte +++ b/src/lib/components/chat/MessageInput.svelte @@ -2,6 +2,7 @@ import { settings } from '$lib/stores'; import toast from 'svelte-french-toast'; import Suggestions from './MessageInput/Suggestions.svelte'; + import { onMount } from 'svelte'; export let submitPrompt: Function; export let stopResponse: Function; @@ -11,6 +12,7 @@ let filesInputElement; let inputFiles; + let dragged = false; export let files = []; @@ -82,8 +84,70 @@ } } }; + + onMount(() => { + const dropZone = document.querySelector('body'); + + dropZone?.addEventListener('dragover', (e) => { + e.preventDefault(); + dragged = true; + }); + + dropZone.addEventListener('drop', (e) => { + e.preventDefault(); + console.log(e); + + if (e.dataTransfer?.files) { + let reader = new FileReader(); + + reader.onload = (event) => { + files = [ + ...files, + { + type: 'image', + url: `${event.target.result}` + } + ]; + }; + + if ( + e.dataTransfer?.files && + e.dataTransfer?.files.length > 0 && + ['image/gif', 'image/jpeg', 'image/png'].includes(e.dataTransfer?.files[0]['type']) + ) { + reader.readAsDataURL(e.dataTransfer?.files[0]); + } else { + toast.error(`Unsupported File Type '${e.dataTransfer?.files[0]['type']}'.`); + } + } + + dragged = false; + }); + }); +{#if dragged} +
+
+
+
+
🏞️
+
Add Images
+ +
+ Drop any images here to add to the conversation +
+
+
+
+
+{/if} +
{#if messages.length == 0 && suggestionPrompts.length !== 0} From 344c91e37a43c46790e1338b134561f3e79ab570 Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Tue, 19 Dec 2023 14:53:14 -0800 Subject: [PATCH 07/20] fix: dragleave event added --- src/lib/components/chat/MessageInput.svelte | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/src/lib/components/chat/MessageInput.svelte b/src/lib/components/chat/MessageInput.svelte index d83a5fcd..54291f22 100644 --- a/src/lib/components/chat/MessageInput.svelte +++ b/src/lib/components/chat/MessageInput.svelte @@ -123,6 +123,10 @@ dragged = false; }); + + dropZone?.addEventListener('dragleave', () => { + dragged = false; + }); }); From 110498cad68ebcd3ea8eadc30ddbd1610481cec8 Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Tue, 19 Dec 2023 14:59:19 -0800 Subject: [PATCH 08/20] fix: disable message image drag --- src/lib/components/chat/Messages.svelte | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/src/lib/components/chat/Messages.svelte b/src/lib/components/chat/Messages.svelte index c6d53424..9a39089d 100644 --- a/src/lib/components/chat/Messages.svelte +++ b/src/lib/components/chat/Messages.svelte @@ -469,7 +469,12 @@ {#each message.files as file}
{#if file.type === 'image'} - input + input {/if}
{/each} From 25e85c0ee3bc330ff74bc436097aca52bdcaf420 Mon Sep 17 00:00:00 2001 From: Yasushiko Date: Tue, 19 Dec 2023 22:10:54 -0500 Subject: [PATCH 09/20] Update TROUBLESHOOTING.md --- TROUBLESHOOTING.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md index 024f4732..2fabe497 100644 --- a/TROUBLESHOOTING.md +++ b/TROUBLESHOOTING.md @@ -46,7 +46,7 @@ docker run --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http:// ``` ## Running ollama-webui as a container on WSL Ubuntu -If you're running ollama-webui via docker on WSL Ubuntu and have chosen to install webui and ollama separately, you might encounter connection issues. This is often due to the docker container being unable to reach the Ollama server at 127.0.0.1:11434. To resolve this, you can use the `--network=host` flag in the docker command. +If you're running ollama-webui via docker on WSL Ubuntu and have chosen to install webui and ollama separately, you might encounter connection issues. This is often due to the docker container being unable to reach the Ollama server at 127.0.0.1:11434. To resolve this, you can use the `--network=host` flag in the docker command. When done so port would be changed from 3000 to 8080, and the link would be: http://localhost:8080. Here's an example of the command you should run: From 0f2360a384f9b24bd56c8a933c54e396a36b51f1 Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Tue, 19 Dec 2023 20:08:47 -0800 Subject: [PATCH 10/20] fix: file drag and drop overlay --- src/lib/components/chat/MessageInput.svelte | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/lib/components/chat/MessageInput.svelte b/src/lib/components/chat/MessageInput.svelte index 54291f22..57f48bef 100644 --- a/src/lib/components/chat/MessageInput.svelte +++ b/src/lib/components/chat/MessageInput.svelte @@ -132,13 +132,13 @@ {#if dragged}
-
+
🏞️
Add Images
From c57c72cc39bbe6498771276162e39f32a33e798f Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Tue, 19 Dec 2023 20:10:58 -0800 Subject: [PATCH 11/20] fix: disable profile image drag --- src/lib/components/chat/Messages.svelte | 18 ++++++++++++++++-- 1 file changed, 16 insertions(+), 2 deletions(-) diff --git a/src/lib/components/chat/Messages.svelte b/src/lib/components/chat/Messages.svelte index 9a39089d..543ce6a5 100644 --- a/src/lib/components/chat/Messages.svelte +++ b/src/lib/components/chat/Messages.svelte @@ -362,9 +362,19 @@
{#if selectedModelfile && selectedModelfile.imageUrl} - + modelfile {:else} - + ollama {/if}
@@ -401,12 +411,14 @@ src="{$settings.gravatarUrl ? $settings.gravatarUrl : '/user'}.png" class=" max-w-[28px] object-cover rounded-full" alt="User profile" + draggable="false" /> {:else} User profile {/if} {:else if selectedModelfile} @@ -414,12 +426,14 @@ src={selectedModelfile?.imageUrl ?? '/favicon.png'} class=" max-w-[28px] object-cover rounded-full" alt="Ollama profile" + draggable="false" /> {:else} Ollama profile {/if}
From a25761a42387b239232065e889bd92aefc4511b0 Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Fri, 22 Dec 2023 01:16:03 -0800 Subject: [PATCH 12/20] fix: custom modelfile url search param --- src/routes/(app)/modelfiles/+page.svelte | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/routes/(app)/modelfiles/+page.svelte b/src/routes/(app)/modelfiles/+page.svelte index ff7519da..297766f9 100644 --- a/src/routes/(app)/modelfiles/+page.svelte +++ b/src/routes/(app)/modelfiles/+page.svelte @@ -98,7 +98,7 @@
@@ -121,7 +121,7 @@ Date: Fri, 22 Dec 2023 01:20:32 -0800 Subject: [PATCH 13/20] fix: duplicate modelfile issue --- src/routes/(app)/modelfiles/create/+page.svelte | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/src/routes/(app)/modelfiles/create/+page.svelte b/src/routes/(app)/modelfiles/create/+page.svelte index 18dee2d3..505ab02d 100644 --- a/src/routes/(app)/modelfiles/create/+page.svelte +++ b/src/routes/(app)/modelfiles/create/+page.svelte @@ -93,7 +93,10 @@ SYSTEM """${system}"""`.replace(/^\s*\n/gm, ''); }; const saveModelfile = async (modelfile) => { - await modelfiles.set([...$modelfiles, modelfile]); + await modelfiles.set([ + ...$modelfiles.filter((m) => m.tagName !== modelfile.tagName), + modelfile + ]); localStorage.setItem('modelfiles', JSON.stringify($modelfiles)); }; From 70435092967533adaeecb340a39ab683bed96b94 Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Fri, 22 Dec 2023 01:33:09 -0800 Subject: [PATCH 14/20] chore: version update --- backend/config.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/backend/config.py b/backend/config.py index 1dabe48a..6abea5ed 100644 --- a/backend/config.py +++ b/backend/config.py @@ -30,7 +30,7 @@ if ENV == "prod": # WEBUI_VERSION #################################### -WEBUI_VERSION = os.environ.get("WEBUI_VERSION", "v1.0.0-alpha.33") +WEBUI_VERSION = os.environ.get("WEBUI_VERSION", "v1.0.0-alpha.34") #################################### # WEBUI_AUTH From f62228165c986f3b650690cdfdde1907045aa5e3 Mon Sep 17 00:00:00 2001 From: "Timothy J. 
Baek" Date: Fri, 22 Dec 2023 20:09:50 -0800 Subject: [PATCH 15/20] feat: custom openai api endpoint support --- src/lib/components/chat/SettingsModal.svelte | 119 +++++++-- src/routes/(app)/+layout.svelte | 10 +- src/routes/(app)/+page.svelte | 245 +++++++++++-------- src/routes/(app)/c/[id]/+page.svelte | 245 +++++++++++-------- 4 files changed, 381 insertions(+), 238 deletions(-) diff --git a/src/lib/components/chat/SettingsModal.svelte b/src/lib/components/chat/SettingsModal.svelte index 67617218..dbca0e79 100644 --- a/src/lib/components/chat/SettingsModal.svelte +++ b/src/lib/components/chat/SettingsModal.svelte @@ -56,6 +56,7 @@ let gravatarEmail = ''; let OPENAI_API_KEY = ''; + let OPENAI_API_BASE_URL = ''; // Auth let authEnabled = false; @@ -302,8 +303,10 @@ // If OpenAI API Key exists if (type === 'all' && $settings.OPENAI_API_KEY) { + const API_BASE_URL = $settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1'; + // Validate OPENAI_API_KEY - const openaiModelRes = await fetch(`https://api.openai.com/v1/models`, { + const openaiModelRes = await fetch(`${API_BASE_URL}/models`, { method: 'GET', headers: { 'Content-Type': 'application/json', @@ -327,8 +330,10 @@ ? [ { name: 'hr' }, ...openAIModels - .map((model) => ({ name: model.id, label: 'OpenAI' })) - .filter((model) => model.name.includes('gpt')) + .map((model) => ({ name: model.id, external: true })) + .filter((model) => + API_BASE_URL.includes('openai') ? model.name.includes('gpt') : true + ) ] : []) ); @@ -363,6 +368,7 @@ gravatarEmail = settings.gravatarEmail ?? ''; OPENAI_API_KEY = settings.OPENAI_API_KEY ?? ''; + OPENAI_API_BASE_URL = settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1'; authEnabled = settings.authHeader !== undefined ? true : false; if (authEnabled) { @@ -476,6 +482,30 @@
Models
+ + +
+ {:else if selectedTab === 'addons'}
{ saveSettings({ gravatarEmail: gravatarEmail !== '' ? gravatarEmail : undefined, - gravatarUrl: gravatarEmail !== '' ? getGravatarURL(gravatarEmail) : undefined, - OPENAI_API_KEY: OPENAI_API_KEY !== '' ? OPENAI_API_KEY : undefined + gravatarUrl: gravatarEmail !== '' ? getGravatarURL(gravatarEmail) : undefined }); show = false; }} @@ -962,26 +1051,6 @@ >
- -
-
-
- OpenAI API Key (optional) -
-
-
- -
-
-
- Adds optional support for 'gpt-*' models available. -
-
diff --git a/src/routes/(app)/+layout.svelte b/src/routes/(app)/+layout.svelte index fe523d93..94d242e1 100644 --- a/src/routes/(app)/+layout.svelte +++ b/src/routes/(app)/+layout.svelte @@ -55,7 +55,9 @@ // If OpenAI API Key exists if ($settings.OPENAI_API_KEY) { // Validate OPENAI_API_KEY - const openaiModelRes = await fetch(`https://api.openai.com/v1/models`, { + + const API_BASE_URL = $settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1'; + const openaiModelRes = await fetch(`${API_BASE_URL}/models`, { method: 'GET', headers: { 'Content-Type': 'application/json', @@ -79,8 +81,10 @@ ? [ { name: 'hr' }, ...openAIModels - .map((model) => ({ name: model.id, label: 'OpenAI' })) - .filter((model) => model.name.includes('gpt')) + .map((model) => ({ name: model.id, external: true })) + .filter((model) => + API_BASE_URL.includes('openai') ? model.name.includes('gpt') : true + ) ] : []) ); diff --git a/src/routes/(app)/+page.svelte b/src/routes/(app)/+page.svelte index d0b83b80..29e6b518 100644 --- a/src/routes/(app)/+page.svelte +++ b/src/routes/(app)/+page.svelte @@ -7,7 +7,7 @@ import { splitStream } from '$lib/utils'; import { goto } from '$app/navigation'; - import { config, modelfiles, user, settings, db, chats, chatId } from '$lib/stores'; + import { config, models, modelfiles, user, settings, db, chats, chatId } from '$lib/stores'; import MessageInput from '$lib/components/chat/MessageInput.svelte'; import Messages from '$lib/components/chat/Messages.svelte'; @@ -130,7 +130,8 @@ const sendPrompt = async (userPrompt, parentId, _chatId) => { await Promise.all( selectedModels.map(async (model) => { - if (model.includes('gpt-')) { + console.log(model); + if ($models.filter((m) => m.name === model)[0].external) { await sendPromptOpenAI(model, userPrompt, parentId, _chatId); } else { await sendPromptOllama(model, userPrompt, parentId, _chatId); @@ -368,129 +369,163 @@ window.scrollTo({ top: document.body.scrollHeight }); - const res = await 
fetch(`https://api.openai.com/v1/chat/completions`, { - method: 'POST', - headers: { - 'Content-Type': 'application/json', - Authorization: `Bearer ${$settings.OPENAI_API_KEY}` - }, - body: JSON.stringify({ - model: model, - stream: true, - messages: [ - $settings.system - ? { - role: 'system', - content: $settings.system - } - : undefined, - ...messages - ] - .filter((message) => message) - .map((message) => ({ - role: message.role, - ...(message.files + const res = await fetch( + `${$settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1'}/chat/completions`, + { + method: 'POST', + headers: { + Authorization: `Bearer ${$settings.OPENAI_API_KEY}`, + 'Content-Type': 'application/json', + 'HTTP-Referer': `https://ollamahub.com/`, + 'X-Title': `Ollama WebUI` + }, + body: JSON.stringify({ + model: model, + stream: true, + messages: [ + $settings.system ? { - content: [ - { - type: 'text', - text: message.content - }, - ...message.files - .filter((file) => file.type === 'image') - .map((file) => ({ - type: 'image_url', - image_url: { - url: file.url - } - })) - ] + role: 'system', + content: $settings.system } - : { content: message.content }) - })), - temperature: $settings.temperature ?? undefined, - top_p: $settings.top_p ?? undefined, - num_ctx: $settings.num_ctx ?? undefined, - frequency_penalty: $settings.repeat_penalty ?? undefined - }) + : undefined, + ...messages + ] + .filter((message) => message) + .map((message) => ({ + role: message.role, + ...(message.files + ? { + content: [ + { + type: 'text', + text: message.content + }, + ...message.files + .filter((file) => file.type === 'image') + .map((file) => ({ + type: 'image_url', + image_url: { + url: file.url + } + })) + ] + } + : { content: message.content }) + })), + temperature: $settings.temperature ?? undefined, + top_p: $settings.top_p ?? undefined, + num_ctx: $settings.num_ctx ?? undefined, + frequency_penalty: $settings.repeat_penalty ?? 
undefined + }) + } + ).catch((err) => { + console.log(err); + return null; }); - const reader = res.body - .pipeThrough(new TextDecoderStream()) - .pipeThrough(splitStream('\n')) - .getReader(); + if (res && res.ok) { + const reader = res.body + .pipeThrough(new TextDecoderStream()) + .pipeThrough(splitStream('\n')) + .getReader(); - while (true) { - const { value, done } = await reader.read(); - if (done || stopResponseFlag || _chatId !== $chatId) { - responseMessage.done = true; - messages = messages; - break; - } + while (true) { + const { value, done } = await reader.read(); + if (done || stopResponseFlag || _chatId !== $chatId) { + responseMessage.done = true; + messages = messages; + break; + } - try { - let lines = value.split('\n'); + try { + let lines = value.split('\n'); - for (const line of lines) { - if (line !== '') { - console.log(line); - if (line === 'data: [DONE]') { - responseMessage.done = true; - messages = messages; - } else { - let data = JSON.parse(line.replace(/^data: /, '')); - console.log(data); - - if (responseMessage.content == '' && data.choices[0].delta.content == '\n') { - continue; - } else { - responseMessage.content += data.choices[0].delta.content ?? ''; + for (const line of lines) { + if (line !== '') { + console.log(line); + if (line === 'data: [DONE]') { + responseMessage.done = true; messages = messages; + } else { + let data = JSON.parse(line.replace(/^data: /, '')); + console.log(data); + + if (responseMessage.content == '' && data.choices[0].delta.content == '\n') { + continue; + } else { + responseMessage.content += data.choices[0].delta.content ?? 
''; + messages = messages; + } } } } + } catch (error) { + console.log(error); } - } catch (error) { + + if ($settings.notificationEnabled && !document.hasFocus()) { + const notification = new Notification(`OpenAI ${model}`, { + body: responseMessage.content, + icon: '/favicon.png' + }); + } + + if ($settings.responseAutoCopy) { + copyToClipboard(responseMessage.content); + } + + if (autoScroll) { + window.scrollTo({ top: document.body.scrollHeight }); + } + + await $db.updateChatById(_chatId, { + title: title === '' ? 'New Chat' : title, + models: selectedModels, + system: $settings.system ?? undefined, + options: { + seed: $settings.seed ?? undefined, + temperature: $settings.temperature ?? undefined, + repeat_penalty: $settings.repeat_penalty ?? undefined, + top_k: $settings.top_k ?? undefined, + top_p: $settings.top_p ?? undefined, + num_ctx: $settings.num_ctx ?? undefined, + ...($settings.options ?? {}) + }, + messages: messages, + history: history + }); + } + } else { + if (res !== null) { + const error = await res.json(); console.log(error); + if ('detail' in error) { + toast.error(error.detail); + responseMessage.content = error.detail; + } else { + if ('message' in error.error) { + toast.error(error.error.message); + responseMessage.content = error.error.message; + } else { + toast.error(error.error); + responseMessage.content = error.error; + } + } + } else { + toast.error(`Uh-oh! There was an issue connecting to ${model}.`); + responseMessage.content = `Uh-oh! There was an issue connecting to ${model}.`; } - if (autoScroll) { - window.scrollTo({ top: document.body.scrollHeight }); - } - - await $db.updateChatById(_chatId, { - title: title === '' ? 'New Chat' : title, - models: selectedModels, - system: $settings.system ?? undefined, - options: { - seed: $settings.seed ?? undefined, - temperature: $settings.temperature ?? undefined, - repeat_penalty: $settings.repeat_penalty ?? undefined, - top_k: $settings.top_k ?? undefined, - top_p: $settings.top_p ?? 
undefined, - num_ctx: $settings.num_ctx ?? undefined, - ...($settings.options ?? {}) - }, - messages: messages, - history: history - }); + responseMessage.error = true; + responseMessage.content = `Uh-oh! There was an issue connecting to ${model}.`; + responseMessage.done = true; + messages = messages; } stopResponseFlag = false; - await tick(); - if ($settings.notificationEnabled && !document.hasFocus()) { - const notification = new Notification(`OpenAI ${model}`, { - body: responseMessage.content, - icon: '/favicon.png' - }); - } - - if ($settings.responseAutoCopy) { - copyToClipboard(responseMessage.content); - } - if (autoScroll) { window.scrollTo({ top: document.body.scrollHeight }); } diff --git a/src/routes/(app)/c/[id]/+page.svelte b/src/routes/(app)/c/[id]/+page.svelte index bf7207fb..a33e8feb 100644 --- a/src/routes/(app)/c/[id]/+page.svelte +++ b/src/routes/(app)/c/[id]/+page.svelte @@ -6,7 +6,7 @@ import { onMount, tick } from 'svelte'; import { convertMessagesToHistory, splitStream } from '$lib/utils'; import { goto } from '$app/navigation'; - import { config, modelfiles, user, settings, db, chats, chatId } from '$lib/stores'; + import { config, models, modelfiles, user, settings, db, chats, chatId } from '$lib/stores'; import MessageInput from '$lib/components/chat/MessageInput.svelte'; import Messages from '$lib/components/chat/Messages.svelte'; @@ -144,7 +144,8 @@ const sendPrompt = async (userPrompt, parentId, _chatId) => { await Promise.all( selectedModels.map(async (model) => { - if (model.includes('gpt-')) { + console.log(model); + if ($models.filter((m) => m.name === model)[0].external) { await sendPromptOpenAI(model, userPrompt, parentId, _chatId); } else { await sendPromptOllama(model, userPrompt, parentId, _chatId); @@ -382,129 +383,163 @@ window.scrollTo({ top: document.body.scrollHeight }); - const res = await fetch(`https://api.openai.com/v1/chat/completions`, { - method: 'POST', - headers: { - 'Content-Type': 'application/json', - 
Authorization: `Bearer ${$settings.OPENAI_API_KEY}` - }, - body: JSON.stringify({ - model: model, - stream: true, - messages: [ - $settings.system - ? { - role: 'system', - content: $settings.system - } - : undefined, - ...messages - ] - .filter((message) => message) - .map((message) => ({ - role: message.role, - ...(message.files + const res = await fetch( + `${$settings.OPENAI_API_BASE_URL ?? 'https://api.openai.com/v1'}/chat/completions`, + { + method: 'POST', + headers: { + Authorization: `Bearer ${$settings.OPENAI_API_KEY}`, + 'Content-Type': 'application/json', + 'HTTP-Referer': `https://ollamahub.com/`, + 'X-Title': `Ollama WebUI` + }, + body: JSON.stringify({ + model: model, + stream: true, + messages: [ + $settings.system ? { - content: [ - { - type: 'text', - text: message.content - }, - ...message.files - .filter((file) => file.type === 'image') - .map((file) => ({ - type: 'image_url', - image_url: { - url: file.url - } - })) - ] + role: 'system', + content: $settings.system } - : { content: message.content }) - })), - temperature: $settings.temperature ?? undefined, - top_p: $settings.top_p ?? undefined, - num_ctx: $settings.num_ctx ?? undefined, - frequency_penalty: $settings.repeat_penalty ?? undefined - }) + : undefined, + ...messages + ] + .filter((message) => message) + .map((message) => ({ + role: message.role, + ...(message.files + ? { + content: [ + { + type: 'text', + text: message.content + }, + ...message.files + .filter((file) => file.type === 'image') + .map((file) => ({ + type: 'image_url', + image_url: { + url: file.url + } + })) + ] + } + : { content: message.content }) + })), + temperature: $settings.temperature ?? undefined, + top_p: $settings.top_p ?? undefined, + num_ctx: $settings.num_ctx ?? undefined, + frequency_penalty: $settings.repeat_penalty ?? 
undefined + }) + } + ).catch((err) => { + console.log(err); + return null; }); - const reader = res.body - .pipeThrough(new TextDecoderStream()) - .pipeThrough(splitStream('\n')) - .getReader(); + if (res && res.ok) { + const reader = res.body + .pipeThrough(new TextDecoderStream()) + .pipeThrough(splitStream('\n')) + .getReader(); - while (true) { - const { value, done } = await reader.read(); - if (done || stopResponseFlag || _chatId !== $chatId) { - responseMessage.done = true; - messages = messages; - break; - } + while (true) { + const { value, done } = await reader.read(); + if (done || stopResponseFlag || _chatId !== $chatId) { + responseMessage.done = true; + messages = messages; + break; + } - try { - let lines = value.split('\n'); + try { + let lines = value.split('\n'); - for (const line of lines) { - if (line !== '') { - console.log(line); - if (line === 'data: [DONE]') { - responseMessage.done = true; - messages = messages; - } else { - let data = JSON.parse(line.replace(/^data: /, '')); - console.log(data); - - if (responseMessage.content == '' && data.choices[0].delta.content == '\n') { - continue; - } else { - responseMessage.content += data.choices[0].delta.content ?? ''; + for (const line of lines) { + if (line !== '') { + console.log(line); + if (line === 'data: [DONE]') { + responseMessage.done = true; messages = messages; + } else { + let data = JSON.parse(line.replace(/^data: /, '')); + console.log(data); + + if (responseMessage.content == '' && data.choices[0].delta.content == '\n') { + continue; + } else { + responseMessage.content += data.choices[0].delta.content ?? 
''; + messages = messages; + } } } } + } catch (error) { + console.log(error); } - } catch (error) { + + if ($settings.notificationEnabled && !document.hasFocus()) { + const notification = new Notification(`OpenAI ${model}`, { + body: responseMessage.content, + icon: '/favicon.png' + }); + } + + if ($settings.responseAutoCopy) { + copyToClipboard(responseMessage.content); + } + + if (autoScroll) { + window.scrollTo({ top: document.body.scrollHeight }); + } + + await $db.updateChatById(_chatId, { + title: title === '' ? 'New Chat' : title, + models: selectedModels, + system: $settings.system ?? undefined, + options: { + seed: $settings.seed ?? undefined, + temperature: $settings.temperature ?? undefined, + repeat_penalty: $settings.repeat_penalty ?? undefined, + top_k: $settings.top_k ?? undefined, + top_p: $settings.top_p ?? undefined, + num_ctx: $settings.num_ctx ?? undefined, + ...($settings.options ?? {}) + }, + messages: messages, + history: history + }); + } + } else { + if (res !== null) { + const error = await res.json(); console.log(error); + if ('detail' in error) { + toast.error(error.detail); + responseMessage.content = error.detail; + } else { + if ('message' in error.error) { + toast.error(error.error.message); + responseMessage.content = error.error.message; + } else { + toast.error(error.error); + responseMessage.content = error.error; + } + } + } else { + toast.error(`Uh-oh! There was an issue connecting to ${model}.`); + responseMessage.content = `Uh-oh! There was an issue connecting to ${model}.`; } - if (autoScroll) { - window.scrollTo({ top: document.body.scrollHeight }); - } - - await $db.updateChatById(_chatId, { - title: title === '' ? 'New Chat' : title, - models: selectedModels, - system: $settings.system ?? undefined, - options: { - seed: $settings.seed ?? undefined, - temperature: $settings.temperature ?? undefined, - repeat_penalty: $settings.repeat_penalty ?? undefined, - top_k: $settings.top_k ?? undefined, - top_p: $settings.top_p ?? 
undefined, - num_ctx: $settings.num_ctx ?? undefined, - ...($settings.options ?? {}) - }, - messages: messages, - history: history - }); + responseMessage.error = true; + responseMessage.content = `Uh-oh! There was an issue connecting to ${model}.`; + responseMessage.done = true; + messages = messages; } stopResponseFlag = false; - await tick(); - if ($settings.notificationEnabled && !document.hasFocus()) { - const notification = new Notification(`OpenAI ${model}`, { - body: responseMessage.content, - icon: '/favicon.png' - }); - } - - if ($settings.responseAutoCopy) { - copyToClipboard(responseMessage.content); - } - if (autoScroll) { window.scrollTo({ top: document.body.scrollHeight }); } From 0fcdee60cd2d121b45c59e2dfdad33ea0b903f5c Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Fri, 22 Dec 2023 20:10:17 -0800 Subject: [PATCH 16/20] chore: version update --- backend/config.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/backend/config.py b/backend/config.py index 6abea5ed..c5a79f57 100644 --- a/backend/config.py +++ b/backend/config.py @@ -30,7 +30,7 @@ if ENV == "prod": # WEBUI_VERSION #################################### -WEBUI_VERSION = os.environ.get("WEBUI_VERSION", "v1.0.0-alpha.34") +WEBUI_VERSION = os.environ.get("WEBUI_VERSION", "v1.0.0-alpha.35") #################################### # WEBUI_AUTH From ecc2466f1ed21b45147e967ebe7b44ff531904e8 Mon Sep 17 00:00:00 2001 From: "Timothy J. 
Baek" Date: Fri, 22 Dec 2023 20:31:42 -0800 Subject: [PATCH 17/20] feat: alternative models response support --- src/lib/components/chat/SettingsModal.svelte | 4 +++- src/routes/(app)/+layout.svelte | 4 +++- 2 files changed, 6 insertions(+), 2 deletions(-) diff --git a/src/lib/components/chat/SettingsModal.svelte b/src/lib/components/chat/SettingsModal.svelte index dbca0e79..5d4334c4 100644 --- a/src/lib/components/chat/SettingsModal.svelte +++ b/src/lib/components/chat/SettingsModal.svelte @@ -323,7 +323,9 @@ return null; }); - const openAIModels = openaiModelRes?.data ?? null; + const openAIModels = Array.isArray(openaiModelRes) + ? openaiModelRes + : openaiModelRes?.data ?? null; models.push( ...(openAIModels diff --git a/src/routes/(app)/+layout.svelte b/src/routes/(app)/+layout.svelte index 94d242e1..af8c7522 100644 --- a/src/routes/(app)/+layout.svelte +++ b/src/routes/(app)/+layout.svelte @@ -74,7 +74,9 @@ return null; }); - const openAIModels = openaiModelRes?.data ?? null; + const openAIModels = Array.isArray(openaiModelRes) + ? openaiModelRes + : openaiModelRes?.data ?? null; models.push( ...(openAIModels From b79c06023b2cc2a8c8b0797893c75f361ad2ff59 Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Fri, 22 Dec 2023 20:40:17 -0800 Subject: [PATCH 18/20] fix: custom suggestion prompts styling --- src/lib/components/chat/MessageInput.svelte | 2 +- src/lib/components/chat/MessageInput/Suggestions.svelte | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/src/lib/components/chat/MessageInput.svelte b/src/lib/components/chat/MessageInput.svelte index 57f48bef..bb941c90 100644 --- a/src/lib/components/chat/MessageInput.svelte +++ b/src/lib/components/chat/MessageInput.svelte @@ -155,7 +155,7 @@
{#if messages.length == 0 && suggestionPrompts.length !== 0} -
+
{/if} diff --git a/src/lib/components/chat/MessageInput/Suggestions.svelte b/src/lib/components/chat/MessageInput/Suggestions.svelte index 6bd1876b..58c75fd1 100644 --- a/src/lib/components/chat/MessageInput/Suggestions.svelte +++ b/src/lib/components/chat/MessageInput/Suggestions.svelte @@ -3,7 +3,7 @@ export let suggestionPrompts = []; -
+
{#each suggestionPrompts as prompt, promptIdx}
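The OpenAI-compatible streaming handler introduced in patch 15 accumulates `data:`-prefixed lines until a `data: [DONE]` sentinel, skipping a single bare-newline delta before any real content arrives. That accumulation step can be sketched in isolation (simplified, and not the project's exact code):

```javascript
// Sketch of the stream-line handling from patch 15: each non-empty line is
// either the "data: [DONE]" sentinel or "data: {json}" carrying a content delta.
const parseStreamLines = (lines) => {
  let content = '';
  let done = false;
  for (const line of lines) {
    if (line === '') continue;
    if (line === 'data: [DONE]') {
      done = true;
      continue;
    }
    const data = JSON.parse(line.replace(/^data: /, ''));
    const delta = data.choices[0].delta.content ?? '';
    // As in the patch, drop a bare leading newline before any real content.
    if (content === '' && delta === '\n') continue;
    content += delta;
  }
  return { content, done };
};
```

In the patches themselves this logic runs inside the reader loop over `splitStream('\n')` chunks; isolating it as a pure function like this makes the `[DONE]` and leading-newline handling easy to verify.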