Merge branch 'main' into fix-openai-settings

commit b919ac76af
Author: Timothy Jaeryang Baek
Date: 2023-12-29 21:05:49 -05:00 (committed by GitHub)
4 changed files with 40 additions and 21 deletions

View file

@@ -61,7 +61,7 @@ Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you c
 - 🔐 **Role-Based Access Control (RBAC)**: Ensure secure access with restricted permissions; only authorized individuals can access your Ollama, and exclusive model creation/pulling rights are reserved for administrators.
-- 🔒 **Backend Reverse Proxy Support**: Strengthen security by enabling direct communication between Ollama Web UI backend and Ollama, eliminating the need to expose Ollama over LAN.
+- 🔒 **Backend Reverse Proxy Support**: Bolster security through direct communication between Ollama Web UI backend and Ollama. This key feature eliminates the need to expose Ollama over LAN. Requests made to the '/ollama/api' route from the web UI are seamlessly redirected to Ollama from the backend, enhancing overall system security.
 - 🌟 **Continuous Updates**: We are committed to improving Ollama Web UI with regular updates and new features.
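The reverse-proxy behaviour described in the updated bullet above can be pictured as a pair of requests: one to the WebUI's `/ollama/api` route and the direct Ollama call it is forwarded to. This is only a sketch; the published port, the `/api/tags` endpoint, and whether an authorization token is required depend on your setup and are assumptions here, not part of this change.

```bash
# A request to the WebUI's proxy route (3000 is the default published port);
# depending on configuration, the backend may require an authorization token.
curl http://localhost:3000/ollama/api/tags

# ...is forwarded by the backend to OLLAMA_API_BASE_URL + "/tags",
# which with the default settings is roughly equivalent to:
curl http://127.0.0.1:11434/api/tags
```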
@@ -71,11 +71,14 @@ Don't forget to explore our sibling project, [OllamaHub](https://ollamahub.com/)
 ## How to Install 🚀
-🌟 **Important Note on User Roles:**
+🌟 **Important Note on User Roles and Privacy:**
 - **Admin Creation:** The very first account to sign up on the Ollama Web UI will be granted **Administrator privileges**. This account will have comprehensive control over the platform, including user management and system settings.
 - **User Registrations:** All subsequent users signing up will initially have their accounts set to **Pending** status by default. These accounts will require approval from the Administrator to gain access to the platform functionalities.
+- **Privacy and Data Security:** We prioritize your privacy and data security above all. Please be reassured that all data entered into the Ollama Web UI is stored locally on your device. Our system is designed to be privacy-first, ensuring that no external requests are made, and your data does not leave your local environment. We are committed to maintaining the highest standards of data privacy and security, ensuring that your information remains confidential and under your control.
 ### Installing Both Ollama and Ollama Web UI Using Docker Compose
 If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:
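For readers who want the moving parts spelled out, a rough `docker run` equivalent of the two-container Compose setup mentioned above might look like the sketch below. The network name, the Ollama volume path, and the `ollama/ollama` image are illustrative assumptions and are not taken from the project's actual docker-compose.yml.

```bash
# Illustrative sketch only -- not the project's Compose file.
# A shared user-defined network lets the WebUI reach Ollama by service name.
docker network create ollama-net

docker run -d --network=ollama-net --name ollama --restart always \
  -v ollama:/root/.ollama \
  ollama/ollama

docker run -d --network=ollama-net --name ollama-webui --restart always \
  -p 3000:8080 \
  -v ollama-webui:/app/backend/data \
  -e OLLAMA_API_BASE_URL=http://ollama:11434/api \
  ghcr.io/ollama-webui/ollama-webui:main
```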

View file

@@ -1,30 +1,32 @@
 # Ollama Web UI Troubleshooting Guide
+## Understanding the Ollama WebUI Architecture
+The Ollama WebUI system is designed to streamline interactions between the client (your browser) and the Ollama API. At the heart of this design is a backend reverse proxy, enhancing security and resolving CORS issues.
+- **How it Works**: When you make a request (like `/ollama/api/tags`) from the Ollama WebUI, it doesn't go directly to the Ollama API. Instead, it first reaches the Ollama WebUI backend. The backend then forwards this request to the Ollama API via the route you define in the `OLLAMA_API_BASE_URL` environment variable. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_API_BASE_URL/tags` in the backend.
+- **Security Benefits**: This design prevents direct exposure of the Ollama API to the frontend, safeguarding against potential CORS (Cross-Origin Resource Sharing) issues and unauthorized access. Requiring authentication to access the Ollama API further enhances this security layer.
 ## Ollama WebUI: Server Connection Error
-If you're running ollama-webui and have chosen to install webui and ollama separately, you might encounter connection issues. This is often due to the docker container being unable to reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434). To resolve this, you can use the `--network=host` flag in the docker command. When done so, the port changes from 3000 to 8080, and the link becomes `http://localhost:8080`.
+If you're experiencing connection issues, it's often due to the WebUI docker container not being able to reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) inside the container. Use the `--network=host` flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, resulting in the link: `http://localhost:8080`.
-Here's an example of the command you should run:
+**Example Docker Command**:
 ```bash
 docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
 ```
-## Connection Errors
+### General Connection Errors
-Make sure you have the **latest version of Ollama** installed before proceeding with the installation. You can find the latest version of Ollama at [https://ollama.ai/](https://ollama.ai/).
+**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit [Ollama's official site](https://ollama.ai/) for the latest updates.
-If you encounter difficulties connecting to the Ollama server, please follow these steps to diagnose and resolve the issue:
+**Troubleshooting Steps**:
-**1. Check Ollama URL Format**
-Ensure that the Ollama URL is correctly formatted in the application settings. Follow these steps:
-- If your Ollama runs in a different host than Web UI make sure Ollama host address is provided when running Web UI container via `OLLAMA_API_BASE_URL` environment variable. [(e.g. OLLAMA_API_BASE_URL=http://192.168.1.1:11434/api)](https://github.com/ollama-webui/ollama-webui#accessing-external-ollama-on-a-different-server)
-- Go to "Settings" within the Ollama WebUI.
-- Navigate to the "General" section.
-- Verify that the Ollama Server URL is set to: `/ollama/api`.
-It is crucial to include the `/api` at the end of the URL to ensure that the Ollama Web UI can communicate with the server.
-By following these troubleshooting steps, you should be able to identify and resolve connection issues with your Ollama server configuration. If you require further assistance or have additional questions, please don't hesitate to reach out or refer to our documentation for comprehensive guidance.
+1. **Verify Ollama URL Format**:
+   - When running the Web UI container, ensure the `OLLAMA_API_BASE_URL` is correctly set, including the `/api` suffix (e.g., `http://192.168.1.1:11434/api` for different-host setups).
+   - In the Ollama WebUI, navigate to "Settings" > "General".
+   - Confirm that the Ollama Server URL is correctly set to `/ollama/api`, including the `/api` suffix.
+By following these enhanced troubleshooting steps, connection issues should be effectively resolved. For further assistance or queries, feel free to reach out to us on our community Discord.
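When the connection error persists, it helps to separate "Ollama itself is down" from "the WebUI container cannot reach Ollama". The checks below are a minimal sketch assuming default ports on a Linux Docker host; `curlimages/curl` is simply a convenient image for running curl inside a throwaway container and is not part of this project.

```bash
# 1. Is Ollama answering on the host at all?
curl http://127.0.0.1:11434/api/tags

# 2. On Docker's default bridge network, 127.0.0.1 inside a container is the
#    container itself, not the host -- which is why the same address fails there
#    and why --network=host (or host.docker.internal) is suggested above.
docker run --rm curlimages/curl -s --max-time 5 http://127.0.0.1:11434/api/tags \
  || echo "127.0.0.1:11434 is not reachable from inside a bridge-network container"
```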

View file

@@ -999,12 +999,12 @@
 <hr class=" dark:border-gray-700" />
 <div>
-<div class=" mb-2.5 text-sm font-medium">Ollama Server URL</div>
+<div class=" mb-2.5 text-sm font-medium">Ollama API URL</div>
 <div class="flex w-full">
 <div class="flex-1 mr-2">
 <input
 class="w-full rounded py-2 px-4 text-sm dark:text-gray-300 dark:bg-gray-800 outline-none"
-placeholder="Enter URL (e.g. http://localhost:11434/api)"
+placeholder="Enter URL (e.g. http://localhost:8080/ollama/api)"
 bind:value={API_BASE_URL}
 />
 </div>
@@ -1030,7 +1030,10 @@
 </div>
 <div class="mt-2 text-xs text-gray-400 dark:text-gray-500">
-Trouble accessing Ollama? <a
+The field above should be set to <span
+class=" text-gray-500 dark:text-gray-300 font-medium">'/ollama/api'</span
+>;
+<a
 class=" text-gray-500 dark:text-gray-300 font-medium"
 href="https://github.com/ollama-webui/ollama-webui#troubleshooting"
 target="_blank"

View file

@@ -249,6 +249,17 @@
 <br class=" hidden sm:flex" />(version
 <span class=" dark:text-white font-medium">{REQUIRED_OLLAMA_VERSION} or higher</span
 >) or check your connection.
+<div class="mt-1 text-sm">
+Trouble accessing Ollama?
+<a
+class=" text-black dark:text-white font-semibold underline"
+href="https://github.com/ollama-webui/ollama-webui#troubleshooting"
+target="_blank"
+>
+Click here for help.
+</a>
+</div>
 </div>
 <div class=" mt-6 mx-auto relative group w-fit">