A production Onyx deployment should be configured with connector-level permission scoping, an explicit retrieval policy, and integration with your enterprise authentication provider.
Example `.env` for a Docker Compose deployment:

```bash
# PostgreSQL database
POSTGRES_HOST=relational_db
POSTGRES_PORT=5432
POSTGRES_DB=onyx
POSTGRES_USER=postgres
POSTGRES_PASSWORD=replace-with-strong-password

# Base URL for the Onyx instance
ONYX_BASE_URL=https://onyx.example.com
# Domain name (used for cookies and CORS)
DOMAIN=onyx.example.com

# Authentication type: basic, oauth, saml, oidc
AUTH_TYPE=basic
# Secret key for session management and encryption (min. 32 chars)
AUTH_SECRET=replace-with-long-random-secret-min-32-chars

# Enable Onyx Craft (AI-powered web app builder)
ENABLE_CRAFT=false

# Vespa vector search engine
VESPA_HOST=index
VESPA_PORT=19071

# Redis cache
REDIS_HOST=cache
REDIS_PORT=6379

# Model servers
MODEL_SERVER_HOST=inference_model_server
INDEXING_MODEL_SERVER_HOST=indexing_model_server
DISABLE_MODEL_SERVER=false

# S3-compatible file store (MinIO by default; change the default credentials in production)
FILE_STORE_BACKEND=s3
S3_ENDPOINT_URL=http://minio:9000
S3_AWS_ACCESS_KEY_ID=minioadmin
S3_AWS_SECRET_ACCESS_KEY=minioadmin
S3_BUCKET=onyx-files

# Nginx proxy timeouts (seconds)
NGINX_PROXY_CONNECT_TIMEOUT=300
NGINX_PROXY_SEND_TIMEOUT=300
NGINX_PROXY_READ_TIMEOUT=300

# Host port mappings
HOST_PORT_80=80
HOST_PORT=3000
```
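The two placeholder secrets above should be replaced with strong random values before deploying. One quick way to generate them, assuming `openssl` is available on the host:

```shell
# Generate replacement values for the placeholder secrets in the .env above.
AUTH_SECRET=$(openssl rand -hex 32)          # 64 hex characters, well over the 32-char minimum
POSTGRES_PASSWORD=$(openssl rand -base64 24) # 32-character base64 password
echo "AUTH_SECRET=${AUTH_SECRET}"
echo "POSTGRES_PASSWORD=${POSTGRES_PASSWORD}"
```

Paste the printed values into `.env`, and keep them out of version control.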
Key security configuration areas:

- Connector Permissions
- Retrieval Policy
- Agent Security
- Authentication & Authorization
Configure LLM providers through the Onyx UI or via environment variables:
| Provider | Configuration Location |
|---|---|
| OpenAI | API key in UI Settings |
| Anthropic | API key in UI Settings |
| Google Gemini | API key in UI Settings |
| Ollama | Base URL (e.g., http://ollama:11434) |
| vLLM | Base URL and model name |
| Azure OpenAI | Endpoint, deployment name, API key |
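For the self-hosted options, it is worth confirming the base URL is reachable from the Onyx host before saving it in the UI. A minimal check against the Ollama example above, using Ollama's standard `/api/version` endpoint (adjust the host for your network):

```shell
OLLAMA_URL="http://ollama:11434"  # base URL from the table above
# -f: fail on HTTP errors; -sS: silent, but still report errors
curl -fsS "${OLLAMA_URL}/api/version" \
  && echo "Ollama is reachable" \
  || echo "Ollama is not reachable from this host"
```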
**PostgreSQL database.** Dump it with:

```bash
docker compose exec relational_db pg_dump -U postgres onyx > onyx-db-backup.sql
```
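To restore, feed the dump back through `psql`. This is a sketch assuming the same compose service names and a freshly created, empty `onyx` database:

```shell
BACKUP_FILE="onyx-db-backup.sql"  # the dump produced by the pg_dump command above
# -T disables the pseudo-TTY so stdin redirection works
docker compose exec -T relational_db psql -U postgres -d onyx < "$BACKUP_FILE" \
  || echo "restore failed: check that the compose stack is running and the dump file exists"
```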
**Vespa index data.** Back up the `vespa_volume` Docker volume.

**Configuration files.** Back up your `.env` file and any compose overrides.
**HuggingFace model cache** (optional, speeds up recovery). Back up the `model_cache_huggingface` Docker volume.
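The named volumes above can be archived with a throwaway container. A sketch; note that Compose may prefix volume names with the project name, so check `docker volume ls` first:

```shell
# Archive each named volume into a gzipped tarball in the current directory.
for VOL in vespa_volume model_cache_huggingface; do
  docker run --rm -v "${VOL}:/data" -v "$PWD:/backup" alpine \
    tar czf "/backup/${VOL}.tgz" -C /data . \
    || echo "backup of ${VOL} failed: check that the volume exists"
done
```

Stop the stack (or at least the index service) before archiving `vespa_volume` to avoid copying files mid-write.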