Open WebUI

Open WebUI is an extensible, feature-rich and user-friendly WebUI for LLMs.



Overview

Open WebUI is a ChatGPT-like web UI for various LLM runners, including Ollama and other OpenAI-compatible APIs.

Open WebUI supports a long list of features; to name a few:

  • πŸ“š RAG integration: Interact with your internal knowledge base by importing documents directly into the chat.
  • 🌐 Web Browsing Capability: Integrate websites into your experience.
  • πŸ—£οΈ Voice Input Support: Engage with your model through voice interactions and enjoy the convenience of talking to your model directly.
  • πŸŽ¨πŸ€– Image Generation Integration: Seamlessly incorporate image generation capabilities using options such as AUTOMATIC1111 API, OpenAI DALLΒ·E and others.
  • πŸ‘πŸ‘Ž RLHF Annotation: Empower your messages by rating them with thumbs up and thumbs down, followed by the option to provide textual feedback, facilitating the creation of datasets for Reinforcement Learning from Human Feedback (RLHF).

Open WebUI also offers great flexibility and configurability, so in practice, you can have your own private ChatGPT suited to your needs.

Try it out

This example app deploys Open WebUI together with Ollama, which simplifies running the latest state-of-the-art models.

Once the Open WebUI server is deployed, you can start interacting with it via your Koyeb App URL, which looks like: https://<YOUR_APP_NAME>-<YOUR_KOYEB_ORG>.koyeb.app.

Pull the model you want from Ollama and start using your private ChatGPT lookalike!
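Beyond the chat interface, Open WebUI also exposes an OpenAI-compatible API, so you can query your deployed model programmatically. Below is a minimal sketch that builds such a request, assuming your Koyeb App URL, an API key generated from the Open WebUI settings page, and a model named `llama3` that you have already pulled (all three are placeholders):

```python
import json
import urllib.request

def chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for an Open WebUI deployment."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/api/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = chat_request(
    "https://<YOUR_APP_NAME>-<YOUR_KOYEB_ORG>.koyeb.app",
    "<YOUR_API_KEY>",
    "llama3",
    "Hello!",
)
# Once the placeholders are filled in, urllib.request.urlopen(req) sends the request
# and returns a JSON chat completion response.
```

The request body follows the standard OpenAI chat completions schema, so any OpenAI-compatible client library can be pointed at the same endpoint instead of hand-rolling the request.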

Other resources related to Open WebUI

Related One-Click Apps in this category

  • DeepSparse Server

DeepSparse is an inference runtime that exploits neural network sparsity to deliver GPU-class performance on CPUs.

  • LangServe

    LangServe makes it easy to deploy LangChain applications as RESTful APIs.

  • LlamaIndex

    LlamaIndex gives you the tools you need to build production-ready LLM applications from your organization's data.
