Local Deep Research (LDR) is a self-hosted research assistant that performs multi-step investigations across web sources, papers, and private documents. It emphasizes transparency and citations, helping users trace where conclusions come from. LDR bundles its dependencies through Docker Compose so you can run the full stack locally. It is designed for researchers, students, and analysts who need repeatable research workflows on private infrastructure.
Key Features
- Research Modes: Quick Summary (30 sec-3 min), Detailed Research (5-15 min), Report Generation (10-30 min), Document Analysis
- Multi-source search: 20+ search engines (arXiv, PubMed, SearXNG, GitHub, Wikipedia, Tavily, Google)
- SQLCipher encryption: AES-256, per-user isolated databases (the same encryption library used by Signal)
- Support for 100+ LLMs: Ollama (local), OpenAI, Anthropic, Google, OpenRouter
- Analytics dashboard: Track costs, performance, and usage metrics
- News subscriptions: Automated research digests with customizable frequency
- Export options: PDF and Markdown export
- REST API: Authenticated HTTP access with per-user databases
- MCP server: Claude Desktop and Claude Code integration
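For the REST API, a client request can be sketched with the standard library. This is a hypothetical example: the port, endpoint path, parameter names, and mode string below are assumptions for illustration, not LDR's documented API; consult the project's API reference for the real contract.

```python
# Hypothetical sketch of building an authenticated request to the LDR
# REST API. Endpoint path, port, and payload fields are assumptions.
import json
from urllib import request

API_BASE = "http://localhost:5000/api/v1"  # assumed default host/port


def build_research_request(query: str, mode: str = "quick-summary") -> request.Request:
    """Build a POST request starting a new research run for this user."""
    payload = json.dumps({"query": query, "mode": mode}).encode()
    return request.Request(
        f"{API_BASE}/research",  # assumed endpoint name
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer <your-token>",  # per-user credential
        },
        method="POST",
    )


req = build_research_request("graphene battery anodes")
# Send with request.urlopen(req) once the server is running.
```

Because each user has an isolated database, the bearer token determines which encrypted store the server reads and writes.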
Use Cases
- Research summarization
- Academic literature discovery
- Private deep research on internal docs
- Automated news monitoring
- Document analysis and retrieval
Tech Stack
- Language: Python (81%), JavaScript (14%)
- Core Libraries: LangChain, FAISS
- Database: SQLCipher (AES-256 encrypted SQLite)
- Search Backend: SearXNG
- Local LLM Runtime: Ollama, LlamaCpp, vLLM (optional)
- Deployment: Docker, Docker Compose
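FAISS in the stack above handles vector retrieval over document embeddings. The core idea, nearest-neighbour search by L2 distance, can be sketched in pure Python with toy 3-dimensional vectors (a real deployment would use `faiss.IndexFlatL2` and embeddings from an encoder model; the document ids below are made up):

```python
# Pure-Python sketch of the nearest-neighbour lookup FAISS performs
# for document retrieval. Vectors are toy 3-dim embeddings.
import math


def l2(a, b):
    """Euclidean (L2) distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


# Hypothetical document store: id -> embedding vector.
index = {
    "paper-arxiv-2101": [0.1, 0.9, 0.3],
    "internal-memo-07": [0.8, 0.1, 0.3],
    "wiki-graphene":    [0.2, 0.8, 0.1],
}


def search(query_vec, k=2):
    """Return the k document ids whose embeddings are closest to the query."""
    ranked = sorted(index, key=lambda doc: l2(index[doc], query_vec))
    return ranked[:k]


print(search([0.15, 0.85, 0.15]))  # → ['wiki-graphene', 'paper-arxiv-2101']
```

FAISS replaces the `sorted` scan with optimized index structures so the same lookup stays fast over millions of embeddings.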
Project Status
- Open-source and self-hosted
- Active development (v1.3.58, March 2026)
- 4.1k+ GitHub stars
- 10K+ Docker pulls