forked from open-webui/open-webui
doc: manual deployment instruction added
This commit is contained in:
parent b3da9b0063
commit 6ca5a23bcb
1 changed file with 50 additions and 11 deletions
README.md

@@ -115,43 +115,53 @@ docker build -t ollama-webui .

```sh
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name ollama-webui --restart always ollama-webui
```
## How to Install Without Docker
While we strongly recommend using our convenient Docker container installation for optimal support, we understand that some situations may require a non-Docker setup, especially for development purposes. Please note that non-Docker installations are not officially supported, and you might need to troubleshoot on your own.

### Project Components

The Ollama Web UI consists of two primary components: the frontend and the backend (which acts as a reverse proxy, serving the static frontend files and providing additional features). For the development environment, both need to run concurrently, with the frontend started via `npm run dev`. Alternatively, you can set `PUBLIC_API_BASE_URL` during the build process so that the frontend connects directly to your Ollama instance, or build the frontend as static files and serve them with the backend.

### Prerequisites

1. **Clone and Enter the Project:**

   ```sh
   git clone https://github.com/ollama-webui/ollama-webui.git
   cd ollama-webui/
   ```

2. **Create and Edit `.env`:**

   ```sh
   cp -RPp example.env .env
   ```
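The contents of `example.env` are not reproduced in this diff, but based on the variables referenced elsewhere in this document, the copied `.env` plausibly contains something like the following sketch. The values shown are illustrative assumptions; check the copied file for the project's real variable names and defaults.

```sh
# Illustrative sketch only - consult your copied example.env for the
# actual contents. 11434 is Ollama's conventional default port.
OLLAMA_API_BASE_URL='http://localhost:11434/api'
PUBLIC_API_BASE_URL=''
```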

### Building Ollama Web UI Frontend

1. **Install Node Dependencies:**

   ```sh
   npm install
   ```

2. **Run in Dev Mode or Build for Deployment:**

   - Dev Mode (requires the backend to be running simultaneously):

     ```sh
     npm run dev
     ```

   - Build for Deployment:

     ```sh
     # `PUBLIC_API_BASE_URL` overwrites the value in `.env`
     PUBLIC_API_BASE_URL='https://example.com/api' npm run build
     ```
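The override works because POSIX shells let you prefix a single command with temporary environment assignments, which take precedence over an existing value for that one command only. A minimal stand-alone demonstration, using a made-up variable name:

```shell
# GREETING is exported with one value, then overridden for a single
# command via an inline assignment (hypothetical variable, for illustration).
export GREETING='from-dotenv'
GREETING='from-cli' sh -c 'echo "$GREETING"'   # prints from-cli
echo "$GREETING"                               # prints from-dotenv
```

The inline assignment never touches the parent shell's value, which is why a one-off `PUBLIC_API_BASE_URL=… npm run build` leaves your `.env` defaults intact for later runs.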

3. **Test the Build with `Caddy` (or your preferred server):**

   ```sh
   curl https://webi.sh/caddy | sh

   caddy run --envfile .env --config ./Caddyfile.localhost
   ```
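The repository ships its own `Caddyfile.localhost`, whose contents are not shown in this diff. Purely as a rough illustration of what serving a static build with Caddy v2 looks like, a minimal file might resemble the following; the site address, port, and `root` path here are assumptions, not the project's actual configuration.

```
# Hypothetical minimal Caddyfile - the real Caddyfile.localhost in the
# repository is authoritative.
http://localhost:3000 {
	root * ./build
	file_server
}
```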

### Running Ollama Web UI Backend

If you wish to run the backend for deployment, ensure that the frontend is built so that the backend can serve the frontend files along with the API route.

#### Setup Instructions

1. **Install Python Requirements:**

   ```sh
   cd ./backend
   pip install -r requirements.txt
   ```
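Installing into a virtual environment is a common precaution that keeps the backend's dependencies out of your system Python; the instructions above do not require it, and the `.venv` name below is a convention, not a project requirement. A sketch:

```shell
# Optional: isolate the backend's Python dependencies before installing.
python3 -m venv .venv
. .venv/bin/activate
# Install the pinned requirements if the file is present in this directory.
if [ -f requirements.txt ]; then
    pip install -r requirements.txt
fi
```

Remember to re-activate the environment (`. .venv/bin/activate`) in any new shell before running the backend scripts.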

2. **Run Python Backend:**

   - Dev Mode with Hot Reloading:

     ```sh
     sh dev.sh
     ```

   - Deployment:

     ```sh
     sh start.sh
     ```

Now you should have the Ollama Web UI up and running at [http://localhost:8080/](http://localhost:8080/). Feel free to explore its features and functionality. If you encounter any issues, please refer to the instructions above or reach out to the community for assistance.
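A quick way to confirm the server is actually listening is a one-line `curl` probe (a hypothetical check, not part of the project's scripts; adjust the host and port if you changed the defaults):

```shell
# Prints one line either way: a success message if the UI answers,
# a hint to revisit the steps above if it does not.
if curl -fsS -o /dev/null http://localhost:8080/; then
    echo "Ollama Web UI is reachable"
else
    echo "Not reachable yet - re-check the backend steps above"
fi
```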

## Troubleshooting

See [TROUBLESHOOTING.md](/TROUBLESHOOTING.md) for information on how to troubleshoot and/or join our [Ollama Web UI Discord community](https://discord.gg/5rJgQTnV4s).