forked from open-webui/open-webui
Merge branch 'main' into dev
commit eadbfeb277
4 changed files with 34 additions and 14 deletions
16 README.md
@@ -45,7 +45,7 @@ Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you c
 
 - ⚙️ **Many Models Conversations**: Effortlessly engage with various models simultaneously, harnessing their unique strengths for optimal responses. Enhance your experience by leveraging a diverse set of models in parallel.
 
-- 🤝 **OpenAI Model Integration**: Seamlessly utilize OpenAI models alongside Ollama models for a versatile conversational experience.
+- 🤝 **OpenAI API Integration**: Effortlessly integrate OpenAI-compatible API for versatile conversations alongside Ollama models. Customize the API Base URL to link with **LMStudio, Mistral, OpenRouter, and more**.
 
 - 🔄 **Regeneration History Access**: Easily revisit and explore your entire regeneration history.
@@ -79,7 +79,19 @@ If you don't have Ollama installed yet, you can use the provided Docker Compose
 docker compose up -d --build
 ```
 
-This command will install both Ollama and Ollama Web UI on your system. Ensure to modify the `compose.yaml` file for GPU support and Exposing Ollama API outside the container stack if needed.
+This command will install both Ollama and Ollama Web UI on your system.
+
+#### Enable GPU
+Use the additional Docker Compose file designed to enable GPU support by running the following command:
+```bash
+docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d --build
+```
+
+#### Expose Ollama API outside the container stack
+Deploy the service with an additional Docker Compose file designed for API exposure:
+```bash
+docker compose -f docker-compose.yml -f docker-compose.api.yml up -d --build
+```
 
 ### Installing Ollama Web UI Only
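A note on the pattern these new instructions rely on: `docker compose` merges every file passed with `-f` in order, and the merged result can be previewed without starting anything. A minimal sanity check, assuming the file names introduced in this commit:

```bash
# Render the effective configuration produced by merging the base file
# with the GPU override; no containers are started.
docker compose -f docker-compose.yml -f docker-compose.gpu.yml config
```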
7 docker-compose.api.yml (new file)
@@ -0,0 +1,7 @@
+version: '3.6'
+
+services:
+  ollama:
+    # Expose Ollama API outside the container stack
+    ports:
+      - 11434:11434
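With this override applied, the Ollama API becomes reachable from the host on port 11434. A quick smoke test (a suggestion, not part of this commit; `/api/tags` is Ollama's endpoint for listing local models):

```bash
# List the models the Ollama server knows about, via the exposed port.
curl http://localhost:11434/api/tags
```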
13 docker-compose.gpu.yml (new file)
@@ -0,0 +1,13 @@
+version: '3.6'
+
+services:
+  ollama:
+    # GPU support
+    deploy:
+      resources:
+        reservations:
+          devices:
+            - driver: nvidia
+              count: 1
+              capabilities:
+                - gpu
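To confirm the device reservation actually reaches the container, one option is to run `nvidia-smi` inside it. A sketch, assuming the `container_name: ollama` set in the base compose file and a working NVIDIA Container Toolkit install (which mounts the driver utilities into the container):

```bash
# Should print the GPU status table if the reservation took effect.
docker exec -it ollama nvidia-smi
```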
12 docker-compose.yml
@@ -2,20 +2,8 @@ version: '3.6'
 
 services:
   ollama:
-    # Uncomment below for GPU support
-    # deploy:
-    #   resources:
-    #     reservations:
-    #       devices:
-    #         - driver: nvidia
-    #           count: 1
-    #           capabilities:
-    #             - gpu
     volumes:
       - ollama:/root/.ollama
-    # Uncomment below to expose Ollama API outside the container stack
-    # ports:
-    #   - 11434:11434
     container_name: ollama
     pull_policy: always
     tty: true
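Because the overrides are ordinary Compose files, they can also be stacked. For example, to enable GPU support and expose the API in a single deployment (an illustration of Compose's merge order, not a command from this commit):

```bash
# Later -f files extend or override earlier ones, so both overrides apply.
docker compose -f docker-compose.yml \
  -f docker-compose.gpu.yml \
  -f docker-compose.api.yml \
  up -d --build
```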