
Ollama Web UI 👋

ChatGPT-Style Web Interface for Ollama 🦙

Demo: Ollama Web UI in action (demo.gif)

Features

  • 🖥️ Intuitive Interface: Our chat interface takes inspiration from ChatGPT, ensuring a user-friendly experience.
  • 📱 Responsive Design: Enjoy a seamless experience on both desktop and mobile devices.
  • ⚡ Swift Responsiveness: Enjoy fast and responsive performance.
  • 🚀 Effortless Setup: Install seamlessly using Docker for a hassle-free experience.
  • 🤖 Multiple Model Support: Seamlessly switch between different chat models for diverse interactions.
  • 🌟 Continuous Updates: We are committed to improving Ollama Web UI with regular updates and new features.

How to Install 🚀

Using Docker 🐳

docker build -t ollama-webui .
docker run -d -p 3000:3000 --name ollama-webui --restart always ollama-webui

Your Ollama Web UI should now be available at http://localhost:3000. Enjoy! 😄
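
The web UI talks to a local Ollama instance, which serves its API on port 11434 by default. If the UI loads but no models appear, a quick sanity check (assuming a standard local Ollama install) is to query the API directly:

# Ollama's API listens on http://localhost:11434 by default;
# this returns a JSON list of the models you have pulled locally.
curl http://localhost:11434/api/tags

# If the list is empty, pull a model first, for example:
ollama pull llama2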

What's Next? 🚀

To-Do List 📝

Here are some exciting tasks on our to-do list:

  • 📜 Chat History: Effortlessly access and manage your conversation history.
  • 📤📥 Import/Export Chat History: Seamlessly move your chat data in and out of the platform.
  • 🎨 Customization: Tailor your chat environment with personalized themes and styles.
  • 📥🗑️ Download/Delete Models: Easily acquire or remove models directly from the web UI.
  • ⚙️ Advanced Parameters Support: Harness the power of advanced settings for fine-tuned control.
  • 📚 Enhanced Documentation: Improve setup and customization guidance with clearer, more comprehensive documentation.
  • 🌟 User Interface Enhancement: Refine the user interface for a smoother, more enjoyable interaction.
  • 🚀 Integration with Messaging Platforms: Explore possibilities for integrating with popular messaging platforms like Slack and Discord.
  • 🧐 User Testing and Feedback Gathering: Conduct thorough user testing to gather insights and refine our offerings based on valuable user feedback.

Feel free to contribute and help us make Ollama Web UI even better! 🙌

Contributors

A big shoutout to our amazing contributors who have helped make this project possible! 🙏

License 📜

This project is licensed under the MIT License - see the LICENSE file for details. 📄

Support 💬

If you have any questions, suggestions, or need assistance, please open an issue or join our Discord community to connect with us! 🤝


Let's make Ollama Web UI even more amazing together! 💪