diff --git a/README.md b/README.md
index de950ef8..b3e407f0 100644
--- a/README.md
+++ b/README.md
@@ -45,7 +45,7 @@ Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you c
 
 - ⚙️ **Many Models Conversations**: Effortlessly engage with various models simultaneously, harnessing their unique strengths for optimal responses. Enhance your experience by leveraging a diverse set of models in parallel.
 
-- 🤝 **OpenAI Model Integration**: Seamlessly utilize OpenAI models alongside Ollama models for a versatile conversational experience.
+- 🤝 **OpenAI API Integration**: Effortlessly integrate OpenAI-compatible API for versatile conversations alongside Ollama models. Customize the API Base URL to link with **LMStudio, Mistral, OpenRouter, and more**.
 
 - 🔄 **Regeneration History Access**: Easily revisit and explore your entire regeneration history.
 
@@ -79,7 +79,19 @@ If you don't have Ollama installed yet, you can use the provided Docker Compose
 docker compose up -d --build
 ```
 
-This command will install both Ollama and Ollama Web UI on your system. Ensure to modify the `compose.yaml` file for GPU support and Exposing Ollama API outside the container stack if needed.
+This command will install both Ollama and Ollama Web UI on your system.
+
+#### Enable GPU
+Use the additional Docker Compose file designed to enable GPU support by running the following command:
+```bash
+docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d --build
+```
+
+#### Expose Ollama API outside the container stack
+Deploy the service with an additional Docker Compose file designed for API exposure:
+```bash
+docker compose -f docker-compose.yml -f docker-compose.api.yml up -d --build
+```
 
 ### Installing Ollama Web UI Only
 
diff --git a/docker-compose.api.yml b/docker-compose.api.yml
new file mode 100644
index 00000000..c36cf11e
--- /dev/null
+++ b/docker-compose.api.yml
@@ -0,0 +1,7 @@
+version: '3.6'
+
+services:
+  ollama:
+    # Expose Ollama API outside the container stack
+    ports:
+      - 11434:11434
\ No newline at end of file
diff --git a/docker-compose.gpu.yml b/docker-compose.gpu.yml
new file mode 100644
index 00000000..db47ae13
--- /dev/null
+++ b/docker-compose.gpu.yml
@@ -0,0 +1,13 @@
+version: '3.6'
+
+services:
+  ollama:
+    # GPU support
+    deploy:
+      resources:
+        reservations:
+          devices:
+            - driver: nvidia
+              count: 1
+              capabilities:
+                - gpu
diff --git a/docker-compose.yml b/docker-compose.yml
index b5036354..427f8580 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -2,20 +2,8 @@ version: '3.6'
 
 services:
   ollama:
-    # Uncomment below for GPU support
-    # deploy:
-    #   resources:
-    #     reservations:
-    #       devices:
-    #         - driver: nvidia
-    #           count: 1
-    #           capabilities:
-    #             - gpu
     volumes:
       - ollama:/root/.ollama
-    # Uncomment below to expose Ollama API outside the container stack
-    # ports:
-    #   - 11434:11434
     container_name: ollama
     pull_policy: always
     tty: true
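The two override files introduced above are independent and can be layered. As a usage sketch (not part of this diff), Docker Compose merges every file passed with `-f` from left to right, so a deployment that both enables the GPU and exposes the Ollama API would combine all three files:

```bash
# Sketch: layer both overrides on top of the base compose file.
# Compose merges left to right, so docker-compose.gpu.yml contributes the
# NVIDIA device reservation and docker-compose.api.yml contributes the
# 11434 port mapping.
docker compose -f docker-compose.yml -f docker-compose.gpu.yml -f docker-compose.api.yml up -d --build

# Optional sanity check: print the merged configuration without deploying,
# to confirm both the `deploy` block and the port mapping are applied.
docker compose -f docker-compose.yml -f docker-compose.gpu.yml -f docker-compose.api.yml config
```

Splitting the optional settings into override files keeps the base `docker-compose.yml` free of commented-out blocks while letting users opt in per feature at the command line.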