Local Deep Research uses environment variables for deployment-specific settings and a web UI for runtime configuration. Most settings can be adjusted at http://localhost:5000/settings.
Warning: Setting a value via environment variable is a hard override: the setting becomes read-only in the web UI and cannot be changed until the environment variable is removed.
```bash
# Data directory (Docker default: /data)
LDR_DATA_DIR=/srv/local-deep-research

# Allow unencrypted SQLite (skip SQLCipher) - set to "true" to disable encryption
LDR_ALLOW_UNENCRYPTED=false

# LLM provider: ollama, openai, anthropic, google, openrouter, lmstudio
LDR_LLM_PROVIDER=ollama

# Ollama server URL (for local models)
LDR_LLM_OLLAMA_URL=http://localhost:11434

# Model name (e.g., gemma3:12b, llama3, mistral)
LDR_LLM_MODEL=gemma3:12b

# Allow new user registration (default: true)
LDR_APP_ALLOW_REGISTRATIONS=true
```
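As a sketch of how these variables might be passed to a standalone container (the image name matches the compose file below; the port mapping and volume name are assumptions for illustration):

```shell
# Run the app container with deployment settings supplied as environment variables.
# Any variable set here becomes read-only in the web UI.
docker run -d --name local-deep-research \
  -p 5000:5000 \
  -e LDR_DATA_DIR=/data \
  -e LDR_LLM_PROVIDER=ollama \
  -e LDR_LLM_OLLAMA_URL=http://host.docker.internal:11434 \
  -e LDR_LLM_MODEL=gemma3:12b \
  -v ldr_data:/data \
  localdeepresearch/local-deep-research:latest
```

Note that `host.docker.internal` is used here because `localhost` inside the container would not reach an Ollama server running on the host.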
**OpenAI:**

```bash
LDR_LLM_PROVIDER=openai
LDR_LLM_OPENAI_API_KEY=sk-...
```

**Anthropic (Claude):**

```bash
LDR_LLM_PROVIDER=anthropic
LDR_LLM_ANTHROPIC_API_KEY=sk-ant-...
```

**Google (Gemini):**

```bash
LDR_LLM_PROVIDER=google
LDR_LLM_GOOGLE_API_KEY=...
```

**OpenRouter (100+ models):**

```bash
LDR_LLM_PROVIDER=openai_endpoint
LDR_LLM_OPENAI_ENDPOINT_URL=https://openrouter.ai/api/v1
LDR_LLM_OPENAI_ENDPOINT_API_KEY=your-openrouter-key
LDR_LLM_MODEL=anthropic/claude-3.5-sonnet
```

**LM Studio (local):**

```bash
LDR_LLM_PROVIDER=lmstudio
LDR_LLM_LMSTUDIO_URL=http://host.docker.internal:1234/v1
LDR_LLM_MODEL=your-local-model
```
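For the local providers, it can help to confirm the model server is reachable before starting the app. A quick check against Ollama's standard HTTP API:

```shell
# List locally available models; a JSON response confirms the server is up
curl -s http://localhost:11434/api/tags

# Pull the model referenced by LDR_LLM_MODEL if it is not in the list
ollama pull gemma3:12b
```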
Configure search engines via the web UI or environment variables:
```bash
# SearXNG instance URL
LDR_SEARCH_ENGINE_WEB_SEARXNG_DEFAULT_PARAMS_INSTANCE_URL=http://localhost:8080

# Tavily API (premium)
LDR_SEARCH_ENGINE_TAVILY_API_KEY=tvly-...

# Google SerpAPI (premium)
LDR_SEARCH_ENGINE_SERPAPI_API_KEY=...

# Brave Search (premium)
LDR_SEARCH_ENGINE_BRAVE_API_KEY=...
```
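A quick way to verify the SearXNG instance is answering (note: the JSON output format must be enabled in SearXNG's `settings.yml` under `search.formats`, otherwise this returns 403):

```shell
# A JSON body with a "results" array confirms SearXNG is up and queryable
curl -s "http://localhost:8080/search?q=test&format=json"
```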
The official docker-compose.yml includes three services:
```yaml
services:
  local-deep-research:
    image: localdeepresearch/local-deep-research:latest
    ports:
      - "5000:5000"
    environment:
      - LDR_DATA_DIR=/data
      - LDR_LLM_OLLAMA_URL=http://ollama:11434
      - LDR_SEARCH_ENGINE_WEB_SEARXNG_DEFAULT_PARAMS_INSTANCE_URL=http://searxng:8080
    volumes:
      - ldr_data:/data
    depends_on:
      ollama:
        condition: service_healthy
      searxng:
        condition: service_started

  ollama:
    image: ollama/ollama:latest
    environment:
      - OLLAMA_KEEP_ALIVE=30m
    volumes:
      - ollama_data:/root/.ollama

  searxng:
    image: searxng/searxng:latest
    volumes:
      - searxng_data:/etc/searxng

volumes:
  ldr_data:
  ollama_data:
  searxng_data:
```
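Bringing the stack up is standard Docker Compose usage; pulling the default model into the `ollama` service is the one extra step:

```shell
# Start all three services in the background
docker compose up -d

# Pull the default model inside the ollama container
docker compose exec ollama ollama pull gemma3:12b

# Follow the application logs
docker compose logs -f local-deep-research
```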
**What to back up:**

- `ldr_data` volume - all user databases, API keys, and research history
- `ollama_data` volume - downloaded LLM models (can be re-downloaded)
- `searxng_data` volume - search engine configuration

**Recovery steps:**
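One common pattern for archiving named Docker volumes is a throwaway container that tars the volume contents to the host (a sketch; compose may prefix volume names with the project name, so check `docker volume ls` for the exact names):

```shell
# Stop the stack so the databases are not written to during the copy
docker compose stop

# Archive each named volume into the current directory
for vol in ldr_data ollama_data searxng_data; do
  docker run --rm -v "${vol}:/data:ro" -v "$PWD:/backup" alpine \
    tar czf "/backup/${vol}.tar.gz" -C /data .
done

docker compose start
```

Restoring is the reverse: extract each archive into a fresh volume before starting the stack.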
`LDR_ALLOW_UNENCRYPTED=false` (default)

| Type | Engines |
|---|---|
| Academic | arXiv, PubMed, Semantic Scholar |
| General | Wikipedia, SearXNG |
| Technical | GitHub, Elasticsearch |
| Historical | Wayback Machine |
| News | The Guardian, Wikinews |
| Premium | Tavily, Google (SerpAPI), Brave Search |
| Custom | Local Documents, LangChain Retrievers |