Open WebUI

From MuHack
Revision as of 17:40, 27 May 2025 by Mrmoddom (talk | contribs) (Created Open WebUI page)


Open WebUI is a self-hosted, browser-based user interface for accessing and interacting with large language models (LLMs) and AI tools running on MuHack’s internal GPU server. The project is designed to provide a clean, secure, and responsive interface for experimentation and development with AI models.

Overview

  • Project Name: Open WebUI
  • Hosted on: GPU Server
  • Access: via Tailscale VPN
  • Purpose: To provide a lightweight, fast, and extensible interface for interacting with local or remote LLMs from a web browser.

Access Restrictions

Access to Open WebUI is strictly limited to internal members of MuHack. To connect:

  1. Establish a secure connection via the internal Tailscale VPN.
  2. Visit the internal address: `http://open-webui.muhack/`

For security and bandwidth reasons, the interface is **not exposed to the public internet**.
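Once the VPN connection is up, reachability of the internal address can be verified from a script. This is a minimal sketch: the URL comes from this page, while the function name and the return-code handling are assumptions.

```python
# Hypothetical reachability check for the internal Open WebUI address.
# Only works when connected to the MuHack tailnet.
import urllib.error
import urllib.request


def webui_reachable(url: str = "http://open-webui.muhack/", timeout: float = 5.0) -> bool:
    """Return True if Open WebUI answers over the VPN, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # Any response below 500 means the service itself is up.
            return resp.status < 500
    except (urllib.error.URLError, OSError):
        # Not on the VPN, DNS failure, or service down.
        return False
```

Running `webui_reachable()` off the VPN simply returns `False`, so the check is safe to use in monitoring scripts.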

Features

  • Chat interface with markdown and code formatting
  • Token and GPU usage tracking
  • Multiple model backends supported (e.g. LLaMA, Mistral, GGUF)
  • Local embeddings & vector search
  • Session persistence
  • Role-based access (using MuHack SSO)
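Open WebUI also exposes an OpenAI-compatible HTTP API alongside the chat interface. The sketch below builds a request against the internal address from this page; the model name, API key, and endpoint path are assumptions and may differ on MuHack's deployment.

```python
# Hypothetical sketch of calling Open WebUI's OpenAI-compatible chat endpoint
# over the tailnet. BASE_URL matches this page; the key and model are placeholders.
import json
import urllib.request

BASE_URL = "http://open-webui.muhack"


def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build a POST request for the chat completions endpoint."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    req = build_chat_request("llama3", "Hello from MuHack!", "sk-example-key")
    # Sending the request requires VPN access and a valid per-user API key:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp))
```

Per-user API keys can typically be generated from the account settings page once logged in via SSO.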

Current Status

Check the current service status on the [status page].

Related Projects

Contact

For access or troubleshooting, contact: