LLM Harbor is a CLI tool and companion application for managing local LLM stacks. The project was created by developer av (av@av.codes) with the goal of simplifying local LLM infrastructure management through a single command interface.
Harbor was designed to solve a common pain point in the GenAI ecosystem: the complexity of setting up and managing multiple LLM-related services. Before Harbor, users needed to manually configure Docker Compose files, manage service networking, and handle model downloads across different backends. Harbor automates this by pre-wiring services together behind a unified CLI.
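As an illustration of that single-command interface, a typical session looks roughly like the following. This is a sketch based on Harbor's documented CLI; exact defaults and service handles may differ between versions:

```sh
# Start the default stack (Ollama + Open WebUI) in the background
harbor up

# Start the stack with an extra service, e.g. the SearXNG search satellite
harbor up searxng

# Open the default web frontend in the browser
harbor open

# Tail service logs, then shut everything down
harbor logs
harbor down
```

Because the services are pre-wired on a shared Docker network, no manual Compose editing or port bookkeeping is required between these steps.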
The project follows a “local-first” philosophy: models and services run on the user’s own hardware, with data staying local rather than flowing through hosted APIs.
Harbor began as a tool for personal LLM experimentation, focusing on the core workflow of starting services with minimal configuration. The initial version supported basic Docker Compose orchestration for Ollama and Open WebUI.
As the GenAI ecosystem grew, Harbor expanded to support additional inference backends, frontends, and satellite services.
Later versions introduced configuration profiles, allowing users to save and switch between different service setups. This addressed the need for reproducible environments across teams and use cases.
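The profile workflow can be sketched as follows. The subcommand names below follow Harbor's `harbor profile` command group, but should be checked against the installed version, and the profile name is a placeholder:

```sh
# Save the current configuration under a named profile
harbor profile save team-rag

# List saved profiles and switch between them
harbor profile list
harbor profile use team-rag
```

Switching profiles swaps the whole service/configuration set at once, which is what makes setups reproducible across machines and teammates.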
Harbor added unified model management with shared caches across services, reducing disk usage and download times. Integration with HuggingFace and Ollama model repositories provides centralized model handling.
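In practice this works through passthrough commands to the underlying model tooling, so downloads land in the shared cache. A sketch, with placeholder model names:

```sh
# Pull a model through the Ollama backend; it is stored in the shared cache
harbor ollama pull llama3.1

# Download a model from HuggingFace into the shared HF cache
harbor hf download microsoft/Phi-3-mini-4k-instruct
```

Any service configured against the same cache directory can then use the model without re-downloading it.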
Version: 0.3.34 (as of January 2026)
Development Status: Active development (v0.x)
License: Apache-2.0
Distribution: PyPI (`llm-harbor`), npm (`@avcodes/harbor`)
Community:
Harbor’s architecture reflects common patterns in the GenAI tooling ecosystem:
`harbor eject` allows migration to standalone Compose files.

Harbor now supports 50+ services across categories:
Frontends (UIs): Open WebUI, ComfyUI, LibreChat, ChatUI, Lobe Chat, Hollama, parllama, BionicGPT, AnythingLLM, and more
Backends: Ollama, llama.cpp, vLLM, TabbyAPI, Aphrodite Engine, mistral.rs, SGLang, KTransformers, KoboldCpp, text-generation-inference, and more
Satellites: SearXNG, Perplexica, Dify, LiteLLM, n8n, LangFlow, Flowise, Qdrant, Traefik, Open Interpreter, AutoGPT, JupyterLab, and 40+ more
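The `harbor eject` command noted above renders the selected services as a plain Docker Compose file, so a stack can outlive Harbor itself. A sketch, with illustrative service handles:

```sh
# Render the configuration for chosen services as a standalone Compose file
harbor eject ollama webui > docker-compose.harbor.yml

# From then on, run the stack without Harbor
docker compose -f docker-compose.harbor.yml up -d
```

This is the escape hatch for users who want Harbor for setup but vanilla Compose for production or CI.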
The project continues to evolve with the GenAI ecosystem, with ongoing work in: