This guide deploys ZeroClaw with Ansible and Docker Compose on Debian (10 through the latest stable release), Ubuntu LTS, and RHEL 9+ compatible hosts. ZeroClaw is an ultra-lightweight Rust AI agent runtime designed for edge devices and resource-constrained environments.
Save the following playbook as `zeroclaw.yml`:

```yaml
---
- name: Deploy ZeroClaw
  hosts: zeroclaw
  become: true
  vars:
    app_root: /opt/zeroclaw
    app_port: 3000
    # Verify image name from upstream repository
    zeroclaw_image: zeroclaw:latest
    # Configure your provider and API key
    zeroclaw_provider: openai
    zeroclaw_api_key: "{{ vault_zeroclaw_api_key }}"
    zeroclaw_channels: ['cli']

  tasks:
    - name: Install Docker on Debian/Ubuntu
      ansible.builtin.apt:
        name:
          - docker.io
          # docker-compose-plugin comes from Docker's own apt repository;
          # recent Debian/Ubuntu releases package it as docker-compose-v2
          - docker-compose-plugin
        state: present
        update_cache: true
      when: ansible_os_family == "Debian"

    - name: Install Docker on RHEL family
      # RHEL 9 repositories ship Podman rather than Docker; these package
      # names require the Docker CE repository to be configured first
      ansible.builtin.dnf:
        name:
          - docker-ce
          - docker-ce-cli
          - docker-compose-plugin
        state: present
      when: ansible_os_family == "RedHat"

    - name: Enable and start Docker
      ansible.builtin.service:
        name: docker
        state: started
        enabled: true

    - name: Create app directory
      ansible.builtin.file:
        path: "{{ app_root }}"
        state: directory
        mode: "0755"

    - name: Create config directory
      ansible.builtin.file:
        path: "{{ app_root }}/.zeroclaw"
        state: directory
        mode: "0700"

    - name: Write ZeroClaw configuration
      ansible.builtin.copy:
        dest: "{{ app_root }}/.zeroclaw/config.toml"
        mode: "0600"
        content: |
          api_key = '{{ zeroclaw_api_key }}'
          provider = '{{ zeroclaw_provider }}'
          channels = {{ zeroclaw_channels | to_json }}
          workspace_only = true
          log_level = 'info'

    - name: Write Docker Compose file
      ansible.builtin.copy:
        dest: "{{ app_root }}/docker-compose.yml"
        mode: "0644"
        content: |
          services:
            app:
              image: {{ zeroclaw_image }}
              container_name: zeroclaw
              restart: unless-stopped
              ports:
                - "127.0.0.1:{{ app_port }}:3000"
              volumes:
                - ./data:/root/.zeroclaw
                - ./.zeroclaw/config.toml:/root/.zeroclaw/config.toml:ro
              read_only: true
              tmpfs:
                - /tmp
              security_opt:
                - no-new-privileges:true
              cap_drop:
                - ALL
              cap_add:
                - NET_BIND_SERVICE
              deploy:
                resources:
                  limits:
                    cpus: '1.0'
                    memory: 256M

    - name: Start application stack
      # docker compose up -d is safe to re-run; it only recreates
      # containers whose configuration has changed
      ansible.builtin.command: docker compose up -d
      args:
        chdir: "{{ app_root }}"

    - name: Verify container is running
      ansible.builtin.command: docker ps --filter name=zeroclaw --format "{{ '{{' }}.Names{{ '}}' }}"
      register: container_status
      changed_when: false
      failed_when: "'zeroclaw' not in container_status.stdout"
```
An inventory for the `zeroclaw` group, e.g. `inventory`:

```ini
[zeroclaw]
server1.example.com
server2.example.com

[zeroclaw:vars]
ansible_user=deploy
ansible_ssh_private_key_file=~/.ssh/id_ed25519
```
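To confirm which hosts the group expands to, `ansible-inventory -i inventory --graph` does the job; without Ansible installed locally, a quick sketch with Python's `configparser` works for simple INI inventories like this one (the `/tmp/inventory.ini` path is just for illustration):

```shell
# List the hosts in the [zeroclaw] group without invoking Ansible.
cat > /tmp/inventory.ini <<'EOF'
[zeroclaw]
server1.example.com
server2.example.com

[zeroclaw:vars]
ansible_user=deploy
ansible_ssh_private_key_file=~/.ssh/id_ed25519
EOF
python3 - <<'EOF'
import configparser
# Host lines are bare keys, so allow values to be omitted.
cp = configparser.ConfigParser(allow_no_value=True, delimiters=('=',))
cp.read('/tmp/inventory.ini')
for host in cp['zeroclaw']:
    print(host)
EOF
```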
Store sensitive API keys in Ansible Vault:

```shell
# Create vault file
ansible-vault create group_vars/zeroclaw/vault.yml
```

Add your API key inside the vault file:

```yaml
vault_zeroclaw_api_key: "sk-your-actual-api-key-here"
```

Run the playbook with the vault:

```shell
ansible-playbook -i inventory zeroclaw.yml --ask-vault-pass
```
Inside the container, ZeroClaw's runtime files live at `~/.zeroclaw/config.toml` and `~/.zeroclaw/.secret_key`.

| Variable | Description | Default |
|---|---|---|
| `app_root` | Installation directory | `/opt/zeroclaw` |
| `app_port` | Exposed port | `3000` |
| `zeroclaw_image` | Container image reference | `zeroclaw:latest` |
| `zeroclaw_provider` | AI provider (`openai`, `anthropic`, `ollama`, `openrouter`, `llama.cpp`, `vllm`, `osaurus`, `codex`) | `openai` |
| `zeroclaw_api_key` | Provider API key (use Ansible Vault) | Required |
| `zeroclaw_channels` | List of channels to enable | `['cli']` |
| Provider | Config Value | Notes |
|---|---|---|
| OpenAI | `openai` | GPT-4, GPT-3.5-turbo |
| Anthropic | `anthropic` | Claude models |
| OpenRouter | `openrouter` | Multi-provider aggregator |
| Ollama | `ollama` | Local models |
| llama.cpp | `llama.cpp` | Local llama-server |
| vLLM | `vllm` | Local vLLM server |
| Osaurus | `osaurus` | macOS edge runtime |
| Custom | `custom` | OpenAI-compatible endpoints |
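For a fully local setup, the same config file can target Ollama instead of a hosted provider. A sketch using only the keys shown in the playbook's template (whether `api_key` may be a dummy value for local providers depends on the ZeroClaw version, so treat the placeholder as an assumption):

```toml
# Local Ollama provider — no paid API key needed; placeholder kept because
# the config schema shown above includes the key.
api_key = 'unused'
provider = 'ollama'
channels = ['cli']
workspace_only = true
log_level = 'info'
```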
```shell
# Check container status
ansible zeroclaw -i inventory -a "docker ps --filter name=zeroclaw"

# View logs
ansible zeroclaw -i inventory -a "docker logs zeroclaw --tail 50"

# Test configuration
ansible zeroclaw -i inventory -a "docker exec zeroclaw zeroclaw test --config /root/.zeroclaw/config.toml"
```