Merge pull request #116 from ollama-webui/dev

doc: docker compose updated
commit 788cda75eb
Author: Timothy Jaeryang Baek (committed by GitHub)
Date: 2023-11-17 13:53:56 -05:00
2 changed files with 11 additions and 6 deletions

README.md

@@ -33,7 +33,7 @@ ChatGPT-Style Web Interface for Ollama 🦙

 - 🤖 **Multiple Model Support**: Seamlessly switch between different chat models for diverse interactions.

-- ⚙️ **Many Models Conversations**: : Effortlessly engage with various models simultaneously, harnessing their unique strengths for optimal responses. Enhance your experience by leveraging a diverse set of models in parallel.
+- ⚙️ **Many Models Conversations**: Effortlessly engage with various models simultaneously, harnessing their unique strengths for optimal responses. Enhance your experience by leveraging a diverse set of models in parallel.

 - 🤝 **OpenAI Model Integration**: Seamlessly utilize OpenAI models alongside Ollama models for a versatile conversational experience.
@@ -62,10 +62,10 @@ ChatGPT-Style Web Interface for Ollama 🦙

 If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:

 ```bash
-docker compose up --build
+docker compose up -d --build
 ```

-This command will install both Ollama and Ollama Web UI on your system. Ensure to modify the `compose.yaml` file for GPU support if needed.
+This command will install both Ollama and Ollama Web UI on your system. Be sure to modify the `compose.yaml` file for GPU support, and to expose the Ollama API outside the container stack, if needed.

 ### Installing Ollama Web UI Only
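Note: the GPU block in `compose.yaml` is commented out by default. A minimal sketch of what enabling it typically looks like in Compose (the `driver`, `count`, and service layout below are illustrative assumptions, not part of this diff, and NVIDIA GPU access also requires the NVIDIA Container Toolkit on the host):

```yaml
services:
  ollama:
    # Reserve one NVIDIA GPU for the ollama service.
    # Values here are illustrative; adjust count/driver for your host.
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities:
                - gpu
```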

compose.yaml

@@ -13,8 +13,9 @@ services:

     #           - gpu
     volumes:
       - ollama:/root/.ollama
-    ports:
-      - 11434:11434
+    # Uncomment below to expose Ollama API outside the container stack
+    # ports:
+    #   - 11434:11434
     container_name: ollama
     pull_policy: always
     tty: true
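Note: with the `ports` mapping commented out, the Ollama API is reachable only from other services on the Compose network, such as the web UI. If you uncomment it, a quick smoke test from the host might look like this (assuming a running stack and the default port 11434):

```bash
# List locally available models via the Ollama API (GET /api/tags).
curl http://localhost:11434/api/tags
```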
@@ -29,8 +30,12 @@ services:

       dockerfile: Dockerfile
     image: ollama-webui:latest
     container_name: ollama-webui
+    depends_on:
+      - ollama
     ports:
       - 3000:8080
+    environment:
+      - "OLLAMA_API_BASE_URL=http://ollama:11434/api"
     extra_hosts:
       - host.docker.internal:host-gateway
     restart: unless-stopped
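Note: `depends_on` only controls start order, and `OLLAMA_API_BASE_URL` points the web UI at the `ollama` service via its Compose DNS name. One way to sanity-check that connectivity from inside the stack, assuming `curl` happens to be available in the `ollama-webui` image (it may not be):

```bash
# Call the ollama service by its Compose DNS name from within the webui container.
docker compose exec ollama-webui curl -s http://ollama:11434/api/tags
```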