Merge branch 'main' into dev

commit eadbfeb277
Timothy Jaeryang Baek, 2023-12-24 15:02:54 -05:00 (committed by GitHub)
4 changed files with 34 additions and 14 deletions

README.md

@@ -45,7 +45,7 @@ Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you c
 - ⚙️ **Many Models Conversations**: Effortlessly engage with various models simultaneously, harnessing their unique strengths for optimal responses. Enhance your experience by leveraging a diverse set of models in parallel.
-- 🤝 **OpenAI Model Integration**: Seamlessly utilize OpenAI models alongside Ollama models for a versatile conversational experience.
+- 🤝 **OpenAI API Integration**: Effortlessly integrate OpenAI-compatible API for versatile conversations alongside Ollama models. Customize the API Base URL to link with **LMStudio, Mistral, OpenRouter, and more**.
 - 🔄 **Regeneration History Access**: Easily revisit and explore your entire regeneration history.
@@ -79,7 +79,19 @@ If you don't have Ollama installed yet, you can use the provided Docker Compose
 docker compose up -d --build
 ```
-This command will install both Ollama and Ollama Web UI on your system. Ensure to modify the `compose.yaml` file for GPU support and Exposing Ollama API outside the container stack if needed.
+This command will install both Ollama and Ollama Web UI on your system.
+
+#### Enable GPU
+Use the additional Docker Compose file designed to enable GPU support by running the following command:
+```bash
+docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d --build
+```
+
+#### Expose Ollama API outside the container stack
+Deploy the service with an additional Docker Compose file designed for API exposure:
+```bash
+docker compose -f docker-compose.yml -f docker-compose.api.yml up -d --build
+```
 ### Installing Ollama Web UI Only
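
The two new commands rely on Docker Compose's multi-file merge: each `-f` file is layered onto the previous one, with later files overriding or extending earlier keys. To preview the merged result before deploying anything, Compose can render it (a quick sketch, assuming Compose v2):

```bash
# Print the fully merged configuration without starting any containers
docker compose -f docker-compose.yml -f docker-compose.gpu.yml config
```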

docker-compose.api.yml (new file, 7 additions)

@@ -0,0 +1,7 @@
+version: '3.6'
+
+services:
+  ollama:
+    # Expose Ollama API outside the container stack
+    ports:
+      - 11434:11434
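
With this override applied, the Ollama API is reachable from the host on port 11434. A quick smoke test against the exposed endpoint (a sketch; `llama2` is just an example model that must already be pulled):

```bash
# Ask the exposed Ollama API for a completion
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```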

docker-compose.gpu.yml (new file, 13 additions)

@@ -0,0 +1,13 @@
+version: '3.6'
+
+services:
+  ollama:
+    # GPU support
+    deploy:
+      resources:
+        reservations:
+          devices:
+            - driver: nvidia
+              count: 1
+              capabilities:
+                - gpu
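
This is the standard Compose syntax for reserving NVIDIA GPUs, and it requires the NVIDIA Container Toolkit on the host. A quick way to confirm the container actually sees the reserved GPU (a sketch, assuming the `ollama` container name from the base compose file):

```bash
# List the GPU(s) visible inside the running container
docker exec -it ollama nvidia-smi
```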

docker-compose.yml

@@ -2,20 +2,8 @@ version: '3.6'
 
 services:
   ollama:
-    # Uncomment below for GPU support
-    # deploy:
-    #   resources:
-    #     reservations:
-    #       devices:
-    #         - driver: nvidia
-    #           count: 1
-    #           capabilities:
-    #             - gpu
     volumes:
       - ollama:/root/.ollama
-    # Uncomment below to expose Ollama API outside the container stack
-    # ports:
-    #   - 11434:11434
     container_name: ollama
     pull_policy: always
     tty: true
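
Since Compose merges `-f` files left to right, both overrides can also be stacked in a single command; this follows from the merge semantics above rather than from anything shown in this diff:

```bash
# Apply the GPU and API-exposure overrides together
docker compose -f docker-compose.yml -f docker-compose.gpu.yml -f docker-compose.api.yml up -d --build
```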