ZeroClaw is a minimalist AI agent runtime implemented in Rust, designed for secure and efficient operation on low-power edge devices. Built from scratch as an independent project, it targets resource-constrained environments with a focus on minimal memory footprint and near-instant startup times.
Tagline: Zero overhead. Zero compromise. 100% Rust. 100% Agnostic.
Official Website: https://zeroclawlabs.ai
- Ultra-lightweight runtime - Common CLI and status workflows run in under 5 MB of RAM (release builds)
- Near-instant startup - Cold starts under 10 ms on 0.8 GHz cores (release builds)
- Small binary size - Approximately 8.8 MB static binary with no runtime dependencies
- 100% Rust - Compile-time memory safety guarantees with no heavyweight runtime dependencies
- Portable architecture - Single binary workflow across ARM, x86, and RISC-V
- Fully swappable - Core systems are traits (providers, channels, tools, memory, tunnels)
- Secure by design - Pairing, strict sandboxing, explicit allowlists, workspace scoping
- No lock-in - OpenAI-compatible provider support with pluggable custom endpoints
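The "fully swappable" design above means core systems are Rust traits that the runtime holds behind trait objects. The sketch below is a hypothetical illustration of that pattern, not ZeroClaw's actual API: the `Provider` trait, `EchoProvider` type, and `run` function are invented names.

```rust
// Hypothetical sketch of a swappable provider trait; ZeroClaw's real
// trait names and method signatures may differ.
trait Provider {
    /// Send a prompt to the backing model and return its reply.
    fn complete(&self, prompt: &str) -> Result<String, String>;
}

// A stub implementation, e.g. for tests or offline use.
struct EchoProvider;

impl Provider for EchoProvider {
    fn complete(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("echo: {prompt}"))
    }
}

// The runtime can hold any backend behind a trait object, so
// providers (or channels, tools, memory) are swappable at startup.
fn run(provider: &dyn Provider, prompt: &str) -> String {
    provider
        .complete(prompt)
        .unwrap_or_else(|e| format!("error: {e}"))
}

fn main() {
    let p = EchoProvider;
    println!("{}", run(&p, "hello"));
}
```

Because callers only depend on the trait, swapping OpenAI for Ollama (or a custom endpoint) is a matter of constructing a different implementor.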
- Edge device deployments (Raspberry Pi, low-power SBCs, $10 development boards)
- High-density multi-tenant hosts
- CI/CD runners with minimal resource overhead
- IoT and embedded AI agent scenarios
- Low-resource VPS deployments
- Teams prioritizing efficiency over full feature parity
| Component | Technology |
| --- | --- |
| Primary Language | Rust (95% of codebase) |
| Secondary Languages | TypeScript (2.5%), Shell (1.5%), Python (0.8%) |
| Architecture | Trait-driven, modular design |
| Binary Type | Static binary (no runtime dependencies) |
- Dual-licensed: MIT OR Apache 2.0 (licensees may use either)
- MIT: Open-source, research, academic, personal use
- Apache 2.0: Patent protection, institutional, commercial deployment
- Contributors grant rights under both licenses (see CLA.md in repository)
- Active community project with 24.3k+ GitHub stars, 3.1k+ forks
- Latest release: v0.1.7 (February 24, 2026)
- Built by students and members of the Harvard, MIT, and Sundai.Club communities
- Early-stage maturity (v0.1.x series)
- Zero documented CVEs as of latest release
ZeroClaw supports 9+ AI providers through its swappable provider architecture:
| Provider | Type | Notes |
| --- | --- | --- |
| OpenAI | AI Model | OpenAI-compatible endpoints |
| Anthropic | AI Model | Supports OAuth and API key auth |
| OpenRouter | AI Model | Aggregator (default in examples) |
| Ollama | AI Model | Local and remote endpoints |
| llama.cpp | AI Model | llama-server endpoint |
| vLLM | AI Model | vLLM server endpoint |
| Osaurus | AI Model | Unified AI edge runtime for macOS |
| OpenAI Codex | AI Model | ChatGPT subscription OAuth |
| Custom | AI Model | OpenAI-compatible or Anthropic-compatible endpoints |
ZeroClaw supports 17+ messaging platforms:
- CLI - Direct command-line interface (built-in)
- Telegram - Bot API with operator-approval flow
- Discord - User ID allowlists
- Slack - Member ID allowlists
- Mattermost - API v4, user ID allowlists
- iMessage - Native macOS integration
- Matrix - E2EE support available
- Signal - Encrypted messaging
- WhatsApp - Web mode + Business Cloud API mode
- Email - SMTP/IMAP integration
- IRC - Classic chat protocol
- Lark - Enterprise messaging
- DingTalk - Alibaba enterprise messaging
- QQ - Tencent messaging platform
- Nostr - NIP-04 and NIP-17 DMs support
- Webhook - HTTP webhook endpoint
- Linq - Integration platform
70+ integrations are available via the plugin system, spanning 9 categories.
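Several of the channels above (Discord, Slack, Mattermost) gate access with explicit user-ID allowlists, in line with the "secure by design" goals. Below is a minimal sketch of how such a check might work; the `Allowlist` type and its methods are illustrative names, not ZeroClaw's actual code.

```rust
use std::collections::HashSet;

// Illustrative allowlist check, in the spirit of the allowlist-gated
// channels described above; not ZeroClaw's real implementation.
struct Allowlist {
    ids: HashSet<String>,
}

impl Allowlist {
    fn new(ids: &[&str]) -> Self {
        Self {
            ids: ids.iter().map(|s| s.to_string()).collect(),
        }
    }

    /// An empty allowlist denies everyone: the check fails closed,
    /// so unknown senders are never serviced by accident.
    fn is_allowed(&self, user_id: &str) -> bool {
        self.ids.contains(user_id)
    }
}

fn main() {
    let list = Allowlist::new(&["12345", "67890"]);
    assert!(list.is_allowed("12345"));
    assert!(!list.is_allowed("99999"));
    println!("allowlist checks passed");
}
```

The fail-closed default matters: a misconfigured (empty) allowlist should silence the channel rather than open it to everyone.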
¶ History and References
- ZeroClaw History
- ZeroClaw Links
- ZeroClaw Alternatives
- Official GitHub: https://github.com/zeroclaw-labs/zeroclaw
- Official Website: https://www.zeroclawlabs.ai/
- ⚠️ Warning: Not affiliated with openagen/zeroclaw, zeroclaw.org, or zeroclaw.net - those domains point to unauthorized forks