From 836babdef44eb26e3dba09fe0ee24ce6413806ab Mon Sep 17 00:00:00 2001
From: cloudXabide
Date: Sat, 2 Dec 2023 08:39:21 -0500
Subject: [PATCH 1/5] Added verbiage for running as container on Apple Silicon

---
 TROUBLESHOOTING.md | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)

diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md
index db9f1582..54cfce40 100644
--- a/TROUBLESHOOTING.md
+++ b/TROUBLESHOOTING.md
@@ -25,3 +25,20 @@ Ensure that the Ollama URL is correctly formatted in the application settings. F
 It is crucial to include the `/api` at the end of the URL to ensure that the Ollama Web UI can communicate with the server.
 
 By following these troubleshooting steps, you should be able to identify and resolve connection issues with your Ollama server configuration. If you require further assistance or have additional questions, please don't hesitate to reach out or refer to our documentation for comprehensive guidance.
+
+## Running ollama-webui as a cintainer on Apple Silicon Mac
+
+If you are running Docker on a M{1..3} based Mac and have taken the steps to run an x86 container, add "--platform linux/amd64" to the docker run command.
+Example:
+```bash
+docker run -d -p 3000:8080 --env-file=$OLLAMA_ENV_FILE --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+```
+Becomes
+```
+docker run -it --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://10.10.10.20:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+
+```
+
+## References
+[Change Docker Desktop Settings on Mac](https://docs.docker.com/desktop/settings/mac/) Search for "x86" in that page.
+[Run x86 (Intel) and ARM based images on Apple Silicon (M1) Macs?](https://forums.docker.com/t/run-x86-intel-and-arm-based-images-on-apple-silicon-m1-macs/117123)

From 6eadfb8a734129f436e7c2fce9b6455dbdfcd8a5 Mon Sep 17 00:00:00 2001
From: James Radtke <47249757+cloudxabide@users.noreply.github.com>
Date: Sat, 2 Dec 2023 14:47:05 -0500
Subject: [PATCH 2/5] Update TROUBLESHOOTING.md

Made style/formatting changes suggested in commit conversation.
---
 TROUBLESHOOTING.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md
index 54cfce40..38c32711 100644
--- a/TROUBLESHOOTING.md
+++ b/TROUBLESHOOTING.md
@@ -31,11 +31,11 @@ By following these troubleshooting steps, you should be able to identify and res
 If you are running Docker on a M{1..3} based Mac and have taken the steps to run an x86 container, add "--platform linux/amd64" to the docker run command.
 Example:
 ```bash
-docker run -d -p 3000:8080 --env-file=$OLLAMA_ENV_FILE --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run -d -p 3000:8080 --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
 ```
 Becomes
 ```
-docker run -it --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://10.10.10.20:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run -it --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://example.com:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
 
 ```
 

From 1ce0356860fb049c0c1de386301ea4c676f8b0d6 Mon Sep 17 00:00:00 2001
From: Timothy Jaeryang Baek
Date: Mon, 4 Dec 2023 03:31:36 -0500
Subject: [PATCH 3/5] Update TROUBLESHOOTING.md

---
 TROUBLESHOOTING.md | 9 ++++++---
 1 file changed, 6 insertions(+), 3 deletions(-)

diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md
index 38c32711..fa6cd83d 100644
--- a/TROUBLESHOOTING.md
+++ b/TROUBLESHOOTING.md
@@ -26,17 +26,20 @@ It is crucial to include the `/api` at the end of the URL to ensure that the Oll
 
 By following these troubleshooting steps, you should be able to identify and resolve connection issues with your Ollama server configuration. If you require further assistance or have additional questions, please don't hesitate to reach out or refer to our documentation for comprehensive guidance.
 
-## Running ollama-webui as a cintainer on Apple Silicon Mac
+## Running ollama-webui as a container on Apple Silicon Mac
+
+If you are running Docker on a M{1..3} based Mac and have taken the steps to run an x86 container, add "--platform linux/amd64" to the docker run command to prevent a warning.
 
-If you are running Docker on a M{1..3} based Mac and have taken the steps to run an x86 container, add "--platform linux/amd64" to the docker run command.
 Example:
+
 ```bash
 docker run -d -p 3000:8080 --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
 ```
+
 Becomes
+
 ```
 docker run -it --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://example.com:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
-
 ```
 
 ## References

From e5a1419d19ba369aafde4a784d573668c8d550e5 Mon Sep 17 00:00:00 2001
From: Timothy Jaeryang Baek
Date: Mon, 4 Dec 2023 03:32:31 -0500
Subject: [PATCH 4/5] Update TROUBLESHOOTING.md

---
 TROUBLESHOOTING.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md
index fa6cd83d..d5799397 100644
--- a/TROUBLESHOOTING.md
+++ b/TROUBLESHOOTING.md
@@ -33,7 +33,7 @@ If you are running Docker on a M{1..3} based Mac and have taken the steps to run
 Example:
 
 ```bash
-docker run -d -p 3000:8080 --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://example.com:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
 ```
 
 Becomes

From e5ffa149cf7ce9229d48ed873a73d7f520405465 Mon Sep 17 00:00:00 2001
From: Timothy Jaeryang Baek
Date: Mon, 4 Dec 2023 03:33:24 -0500
Subject: [PATCH 5/5] Update TROUBLESHOOTING.md

---
 TROUBLESHOOTING.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md
index d5799397..0b5536c5 100644
--- a/TROUBLESHOOTING.md
+++ b/TROUBLESHOOTING.md
@@ -39,7 +39,7 @@ docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://example.com:11434/api -
 Becomes
 
 ```
-docker run -it --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://example.com:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run --platform linux/amd64 -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://example.com:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
 ```
 
 ## References
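Once the container from the final patch is running, the commands below can help confirm that amd64 emulation is actually in effect on an Apple Silicon Mac. This is a minimal sketch using standard Docker CLI output formats; the image tag and container name are simply the ones used in the examples above and may differ in your setup.

```bash
# Engine architecture: Docker Desktop on Apple Silicon reports arm64 here
docker version --format '{{.Server.Arch}}'

# Architecture the pulled image was built for: should report amd64 for the x86 image
docker image inspect ghcr.io/ollama-webui/ollama-webui:main --format '{{.Architecture}}'

# Inside the running container, uname reports x86_64 when the container runs under amd64 emulation
docker exec ollama-webui uname -m
```

If the last command prints `aarch64` instead, the container is most likely running a native ARM image, meaning `--platform linux/amd64` was not applied when the container was created.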