forked from open-webui/open-webui
Merge pull request #180 from ollama-webui/doc-update
doc: local build tldr section added
commit 597ba08026
3 changed files with 38 additions and 13 deletions
README.md (28 changes)
@@ -35,7 +35,7 @@ Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you c
 
 - 🤖 **Multiple Model Support**: Seamlessly switch between different chat models for diverse interactions.
 
-- 🗃️ **Modelfile Builder**: Easily create Ollama modelfiles via the web UI. Create and add your own character to Ollama by customizing system prompts, conversation starters, and more.
+- 🧩 **Modelfile Builder**: Easily create Ollama modelfiles via the web UI. Create and add characters/agents, customize chat elements, and import modelfiles effortlessly through [OllamaHub](https://ollamahub.com/) integration.
 
 - ⚙️ **Many Models Conversations**: Effortlessly engage with various models simultaneously, harnessing their unique strengths for optimal responses. Enhance your experience by leveraging a diverse set of models in parallel.
 
@@ -59,7 +59,7 @@ Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you c
 
 - 🌟 **Continuous Updates**: We are committed to improving Ollama Web UI with regular updates and new features.
 
 ## 🔗 Also Check Out OllamaHub!
 
 Don't forget to explore our sibling project, [OllamaHub](https://ollamahub.com/), where you can discover, download, and explore customized Modelfiles. OllamaHub offers a wide range of exciting possibilities for enhancing your chat interactions with Ollama! 🚀
 
@@ -121,6 +121,29 @@ docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name
 
 While we strongly recommend using our convenient Docker container installation for optimal support, we understand that some situations may require a non-Docker setup, especially for development purposes. Please note that non-Docker installations are not officially supported, and you might need to troubleshoot on your own.
 
+### TL;DR 🚀
+
+Run the following commands to install:
+
+```sh
+git clone https://github.com/ollama-webui/ollama-webui.git
+cd ollama-webui/
+
+# Copying required .env file
+cp -RPp example.env .env
+
+# Building Frontend
+npm i
+npm run build
+
+# Serving Frontend with the Backend
+cd ./backend
+pip install -r requirements.txt
+sh start.sh
+```
+
+You should have the Ollama Web UI up and running at http://localhost:8080/. Enjoy! 😄
+
 ### Project Components
 
 The Ollama Web UI consists of two primary components: the frontend and the backend (which serves as a reverse proxy, handling static frontend files, and additional features). Both need to be running concurrently for the development environment using `npm run dev`. Alternatively, you can set the `PUBLIC_API_BASE_URL` during the build process to have the frontend connect directly to your Ollama instance or build the frontend as static files and serve them with the backend.
@@ -211,7 +234,6 @@ See [TROUBLESHOOTING.md](/TROUBLESHOOTING.md) for information on how to troubles
 
 Here are some exciting tasks on our roadmap:
 
 - 🔄 **Multi-Modal Support**: Seamlessly engage with models that support multimodal interactions, including images (e.g., LLava).
 - 📚 **RAG Integration**: Experience first-class retrieval augmented generation support, enabling chat with your documents.
 - 🔐 **Access Control**: Securely manage requests to Ollama by utilizing the backend as a reverse proxy gateway, ensuring only authenticated users can send specific requests.
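
The "Project Components" paragraph above mentions setting `PUBLIC_API_BASE_URL` at build time as an alternative to serving the frontend through the backend proxy. A minimal sketch of that variant, assuming a local Ollama instance on its default port (the exact URL is an assumption, not part of this commit):

```sh
# Sketch only: build the static frontend to talk directly to a local Ollama
# instance instead of going through the backend reverse proxy.
PUBLIC_API_BASE_URL='http://localhost:11434/api' npm run build
```

Serving the built files through the backend, as the TL;DR above does, keeps the frontend and the API on one origin, which is likely why the README calls that route recommended.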
backend/config.py (5 changes)

@@ -6,8 +6,7 @@ from secrets import token_bytes
 from base64 import b64encode
 import os
 
-load_dotenv(find_dotenv("../.env"))
+load_dotenv(find_dotenv())
 
 ####################################
 # ENV (dev,test,prod)
@@ -38,7 +37,7 @@ WEBUI_VERSION = os.environ.get("WEBUI_VERSION", "v1.0.0-alpha.21")
 ####################################
 
 
-WEBUI_AUTH = True if os.environ.get("WEBUI_AUTH", "TRUE") == "TRUE" else False
+WEBUI_AUTH = True if os.environ.get("WEBUI_AUTH", "FALSE") == "TRUE" else False
 
 
 ####################################
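
The change above flips the default: authentication is now opt-in rather than opt-out, and since the check is an exact string comparison, only `WEBUI_AUTH=TRUE` (uppercase) enables it. A small sketch of opting back in, assuming `start.sh` passes its environment through to the Python process that reads this setting:

```sh
# Illustrative only: re-enable authentication under the new default.
# `VAR=value command` exports the variable for that single command.
cd ./backend
WEBUI_AUTH=TRUE sh start.sh
```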
example.env (18 changes)
@@ -1,8 +1,12 @@
-# must be defined, but defaults to 'http://{location.hostname}:11434/api'
-# can also use path, such as '/api'
-PUBLIC_API_BASE_URL=''
+# If you're serving both the frontend and backend (Recommended)
+# Set the public API base URL for seamless communication
+PUBLIC_API_BASE_URL='/ollama/api'
 
-OLLAMA_API_ID='my-api-token'
-OLLAMA_API_TOKEN='xxxxxxxxxxxxxxxx'
-# generated by passing the token to `caddy hash-password`
-OLLAMA_API_TOKEN_DIGEST='$2a$14$iyyuawykR92xTHNR9lWzfu.uCct/9/xUPX3zBqLqrjAu0usNRPbyi'
+# If you're serving only the frontend (Not recommended and not fully supported)
+# Comment above and Uncomment below
+# You can use the default value or specify a custom path, e.g., '/api'
+# PUBLIC_API_BASE_URL='http://{location.hostname}:11434/api'
+
+# Ollama URL for the backend to connect
+# The path '/ollama/api' will be redirected to the specified backend URL
+OLLAMA_API_BASE_URL='http://localhost:11434/api'
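
Taken together, the new defaults mean the frontend calls `/ollama/api` on its own origin and the backend forwards those requests to `OLLAMA_API_BASE_URL`. A hedged smoke test, assuming the TL;DR setup is running on port 8080 (`/api/tags` is Ollama's model-listing endpoint; the proxied path is inferred from the comments above):

```sh
# Should return the model list from the Ollama server configured in .env,
# confirming the /ollama/api -> OLLAMA_API_BASE_URL redirection works.
curl http://localhost:8080/ollama/api/tags
```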