doc: setup instructions updated
commit 9ddde1f833 (parent eff48d7e3f)
1 changed file with 13 additions and 11 deletions

README.md

## How to Install 🚀

### Installing Both Ollama and Ollama Web UI Using Docker Compose

If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:

```bash
docker compose up --build
```

This command will install both Ollama and Ollama Web UI on your system. If you need GPU support, modify the `compose.yaml` file accordingly.

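As a rough sketch of what that GPU modification can look like (the service name `ollama` is an assumption here; match it to whatever `compose.yaml` actually defines), Compose can reserve NVIDIA GPUs with a standard device reservation. This requires the NVIDIA Container Toolkit on the host:

```yaml
# Sketch only: the "ollama" service name is an assumption; align it with compose.yaml.
services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```
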
### Installing Ollama Web UI Only

#### Prerequisites

Make sure you have the latest version of Ollama installed before proceeding with the installation. You can find the latest version of Ollama at [https://ollama.ai/](https://ollama.ai/).

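For Linux users, a hedged sketch of the install step (the exact command is an assumption; always follow the current instructions published on the site):

```bash
# Assumption: ollama.ai publishes a Linux install script at this path;
# verify against https://ollama.ai/ before piping anything to sh.
curl https://ollama.ai/install.sh | sh
```
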
##### Checking Ollama

After installing Ollama, verify that it is running by opening [http://127.0.0.1:11434/](http://127.0.0.1:11434/) in your web browser. Note that the port number may differ based on your system configuration.

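The same check works from a terminal; assuming the default port, the root endpoint answers with a short status message:

```bash
# Expects a plain-text "Ollama is running" response on the default port.
curl http://127.0.0.1:11434/
```
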
#### Using Docker 🐳

If Ollama is hosted on your local machine and accessible at [http://127.0.0.1:11434/](http://127.0.0.1:11434/), run the following command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```

Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000) and accessible over your LAN (or network). Enjoy! 😄

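If the page does not load, standard Docker commands (nothing specific to this project) help confirm the container is healthy:

```bash
# Verify the container is running, then tail its logs for errors.
docker ps --filter name=ollama-webui
docker logs -f ollama-webui
```
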
#### Accessing External Ollama on a Different Server

Change the `OLLAMA_API_BASE_URL` environment variable to match the URL of your external Ollama server:

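For example, assuming the external server's API base URL is `https://example.com/api` (a placeholder; substitute your own):

```bash
# Placeholder URL: point OLLAMA_API_BASE_URL at your external Ollama server.
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```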