Merge branch 'main' into dev

commit d4f783cbb0
Timothy Jaeryang Baek, 2023-12-14 04:12:10 -05:00, committed by GitHub
2 changed files with 24 additions and 18 deletions

@@ -20,9 +20,10 @@ This configuration allows Ollama to accept connections from any source.
 Ensure that the Ollama URL is correctly formatted in the application settings. Follow these steps:
+- If Ollama runs on a different host than the Web UI, make sure the Ollama host address is provided via the `OLLAMA_API_BASE_URL` environment variable when running the Web UI container. [(e.g. OLLAMA_API_BASE_URL=http://192.168.1.1:11434/api)](https://github.com/ollama-webui/ollama-webui#accessing-external-ollama-on-a-different-server)
 - Go to "Settings" within the Ollama WebUI.
 - Navigate to the "General" section.
-- Verify that the Ollama URL is in the following format: `http://localhost:11434/api`.
+- Verify that the Ollama Server URL is set to: `/ollama/api`.
 It is crucial to include the `/api` at the end of the URL to ensure that the Ollama Web UI can communicate with the server.
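The documentation change above stresses that the configured URL must end in `/api`. As an illustration only, a small helper (hypothetical, not part of the project) could normalize a user-supplied base URL into that form:

```python
def normalize_ollama_url(url: str) -> str:
    """Hypothetical helper: normalize a base URL so it ends with /api."""
    url = url.rstrip("/")          # drop any trailing slash
    if not url.endswith("/api"):   # append the required /api suffix
        url += "/api"
    return url

print(normalize_ollama_url("http://192.168.1.1:11434"))
# http://192.168.1.1:11434/api
```

The same function leaves an already-correct URL such as `http://localhost:11434/api` untouched.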

@@ -59,6 +59,7 @@ def proxy(path):
     else:
         pass
+    try:
         # Make a request to the target server
         target_response = requests.request(
             method=request.method,
@@ -68,6 +69,8 @@ def proxy(path):
             stream=True,  # Enable streaming for server-sent events
         )
+        target_response.raise_for_status()
+
         # Proxy the target server's response to the client
         def generate():
             for chunk in target_response.iter_content(chunk_size=8192):
@@ -80,6 +83,8 @@ def proxy(path):
             response.headers[key] = value
         return response
+    except Exception as e:
+        return jsonify({"detail": "Server Connection Error", "message": str(e)}), 400

 if __name__ == "__main__":
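The new `try`/`except` turns any upstream failure (including HTTP error statuses surfaced by `raise_for_status()`) into a JSON response with status 400. A minimal sketch of that error envelope, using a plain dict in place of Flask's `jsonify`:

```python
def error_payload(exc: Exception):
    # Same shape as the proxy's except-branch, but returning a plain dict
    # instead of a Flask jsonify response (sketch only).
    return {"detail": "Server Connection Error", "message": str(exc)}, 400

# Simulate an upstream failure being caught and wrapped
try:
    raise ConnectionError("connection refused")
except Exception as e:
    body, status = error_payload(e)

print(status, body["message"])
# 400 connection refused
```

Catching the broad `Exception` here means connection errors, timeouts, and non-2xx responses all produce the same client-facing message, with the original error text preserved in the `message` field.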