This guide covers downloading, installing, and configuring FastGPT for running knowledge-based AI agents with RAG capabilities.
Ensure you have:

- Docker 20.10+
- Docker Compose v2
Linux/macOS:

```bash
# Pull configuration files
bash <(curl -fsSL https://doc.fastgpt.cn/deploy/install.sh)

# Start FastGPT
docker compose up -d
```
Windows (PowerShell):

```powershell
# Download docker-compose.yml
Invoke-WebRequest -Uri "https://raw.githubusercontent.com/labring/FastGPT/main/projects/app/docker-compose.yml" -OutFile "docker-compose.yml"

# Start FastGPT
docker compose up -d
```
After startup (wait 2-3 minutes), open http://localhost:3000 and log in with the default credentials: username `root`, password `1234`.

⚠️ Change the default password immediately after first login!
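Startup time varies by machine, so rather than guessing at the 2-3 minute window you can poll the port until the web UI answers. A minimal Python sketch (port 3000 matches the API examples later in this guide; the 180-second timeout is an arbitrary choice):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 180.0) -> bool:
    """Poll host:port until a TCP connection succeeds or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2):
                return True  # something is accepting connections
        except OSError:
            time.sleep(2)  # not up yet; retry
    return False
```

For example, running `wait_for_port("localhost", 3000)` right after `docker compose up -d` returns `True` once the web UI port starts accepting connections.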
Step 1: Create directory

```bash
mkdir -p ~/fastgpt && cd ~/fastgpt
```

Step 2: Download docker-compose.yml

```bash
curl -o docker-compose.yml https://raw.githubusercontent.com/labring/FastGPT/main/projects/app/docker-compose.yml
```

Step 3: Start services

```bash
docker compose up -d
```

Step 4: Check logs

```bash
docker compose logs -f
```
Sealos provides one-click FastGPT deployment.

Prerequisites:

- Node.js 18+
- pnpm
- Git
Clone and install:

```bash
git clone https://github.com/labring/FastGPT.git
cd FastGPT
pnpm install
```

Configure environment:

```bash
cp .env.example .env.local
# Edit .env.local with your settings
```

Start development:

```bash
pnpm dev
```
Minimum requirements:

| Component | Requirement |
|---|---|
| CPU | 2 cores |
| RAM | 4 GB |
| Disk | 20 GB |
| OS | Linux, macOS, Windows |

Recommended requirements:

| Component | Requirement |
|---|---|
| CPU | 4+ cores |
| RAM | 8+ GB |
| Disk | 50+ GB SSD |
| OS | Linux (Ubuntu 20.04+) |

Production requirements:

| Component | Requirement |
|---|---|
| CPU | 8+ cores |
| RAM | 16+ GB |
| Disk | 100+ GB SSD |
| Network | 100 Mbps+ |
| Runtime | Kubernetes cluster |
Create or edit the `.env` file:

```bash
# Database
DB_URL=mongodb://mongodb:27017/fastgpt
PG_URL=postgresql://postgres:postgres@postgresql:5432/fastgpt

# API Keys
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key

# Security
ROOT_KEY=your-root-key
TOKEN_KEY=your-token-key

# Optional
LLM_REQUEST_TRACKING_RETENTION_HOURS=6
```
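Missing or empty keys in `.env` are a common cause of startup failures. A small sketch that validates the file before you run `docker compose up` (the required-key list mirrors the example above; adjust it for your deployment):

```python
# Keys taken from the .env example above; extend as needed.
REQUIRED_KEYS = ["DB_URL", "PG_URL", "ROOT_KEY", "TOKEN_KEY"]

def load_env(path: str) -> dict:
    """Parse a simple KEY=VALUE .env file, ignoring comments and blank lines."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

def missing_keys(env: dict) -> list:
    """Return required keys that are absent or empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]
```

If `missing_keys(load_env(".env"))` returns a non-empty list, fix those entries before starting the stack.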
| Credential | Default | Change Required |
|---|---|---|
| Username | root | ✅ Yes |
| Password | 1234 | ✅ Yes |
Change the password immediately after logging in.
Settings → Model Providers:

- OpenAI: requires `OPENAI_API_KEY` (see the `.env` example above)
- Anthropic: requires `ANTHROPIC_API_KEY`
- Local Models: point FastGPT at a locally hosted endpoint such as Ollama or llama.cpp
| Provider | Models |
|---|---|
| OpenAI | GPT-4, GPT-4 Turbo, GPT-3.5 Turbo |
| Anthropic | Claude 3.5 Sonnet, Claude 3 Opus |
| Qwen | Qwen 2.5, Qwen Max |
| DeepSeek | DeepSeek V2, DeepSeek Coder |
| Local | Ollama models, llama.cpp |
Import methods:
| Method | Description |
|---|---|
| Direct Upload | Upload files directly |
| URL Import | Import from web URLs |
| QA Split | Import Q&A pairs |
| CSV Bulk | Bulk import via CSV |
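For the CSV bulk route, preparing the file programmatically helps when you have many Q&A pairs. A sketch of generating such a file (the two-column `question,answer` layout here is an assumption; check the template FastGPT offers in its import dialog for the exact format your version expects):

```python
import csv

def write_qa_csv(path: str, pairs: list) -> None:
    """Write (question, answer) tuples to a CSV for bulk import.

    The header row is illustrative, not FastGPT's official template;
    match the template provided in the import dialog.
    """
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["question", "answer"])  # hypothetical header
        writer.writerows(pairs)

pairs = [
    ("What is FastGPT?", "A platform for knowledge-based AI agents with RAG."),
    ("What is the default port?", "3000"),
]
```

Calling `write_qa_csv("qa.csv", pairs)` produces a file you can then upload through the CSV Bulk option.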
Workflow editor node types:

| Node Type | Purpose |
|---|---|
| Start | Workflow entry point |
| LLM | AI model inference |
| Knowledge Base | RAG retrieval |
| HTTP Request | External API calls |
| Code | Custom JavaScript |
| Response | Output generation |
| Plugin | Reusable components |
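Conceptually, a workflow built from the nodes above is a directed graph that executes from Start to Response. A purely illustrative sketch of that idea, with a topological ordering to determine execution order (node and edge shapes here are invented for illustration, not FastGPT's internal schema):

```python
from collections import deque

# Purely illustrative graph: these dict shapes are NOT FastGPT's schema.
workflow = {
    "nodes": [
        {"id": "start", "type": "Start"},
        {"id": "kb", "type": "Knowledge Base"},   # RAG retrieval
        {"id": "llm", "type": "LLM"},             # model inference
        {"id": "out", "type": "Response"},        # output generation
    ],
    "edges": [("start", "kb"), ("kb", "llm"), ("llm", "out")],
}

def execution_order(wf: dict) -> list:
    """Kahn's algorithm: order nodes so each runs after its inputs."""
    indegree = {n["id"]: 0 for n in wf["nodes"]}
    adjacency = {n["id"]: [] for n in wf["nodes"]}
    for src, dst in wf["edges"]:
        adjacency[src].append(dst)
        indegree[dst] += 1
    ready = deque(n for n, d in indegree.items() if d == 0)
    order = []
    while ready:
        node = ready.popleft()
        order.append(node)
        for nxt in adjacency[node]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    return order
```

For the chain above, `execution_order(workflow)` yields Start → Knowledge Base → LLM → Response, matching how retrieval feeds the model before the output is generated.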
FastGPT provides an OpenAI-compatible API:

```bash
curl http://localhost:3000/api/v1/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "app_id": "your-app-id",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
```
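The same call can be made from Python using only the standard library. A sketch mirroring the curl example above (the endpoint path, `app_id` field, and message shape come from that example; the response structure is not shown here, so inspect the returned dict for your FastGPT version):

```python
import json
import urllib.request

def build_completion_payload(app_id: str, user_message: str) -> dict:
    """Build the request body shown in the curl example above."""
    return {
        "app_id": app_id,
        "messages": [{"role": "user", "content": user_message}],
    }

def fastgpt_completion(api_key: str, app_id: str, user_message: str,
                       base_url: str = "http://localhost:3000") -> dict:
    """POST to the completion endpoint and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/completions",
        data=json.dumps(build_completion_payload(app_id, user_message)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Usage: `fastgpt_completion("YOUR_API_KEY", "your-app-id", "Hello!")` against a running instance.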
Service won't start:

```bash
# Check logs
docker compose logs

# Verify resources
docker stats

# Restart services
docker compose restart
```

Database connection issues:

```bash
# Check MongoDB status
docker compose ps mongodb

# View MongoDB logs
docker compose logs mongodb

# Verify connection string
docker compose exec fastgpt env | grep DB_URL
```

Web UI not reachable:

```bash
# Check if port 3000 is listening
netstat -tlnp | grep 3000

# Check container status
docker compose ps
```
```bash
cd ~/fastgpt

# Pull latest images
docker compose pull

# Restart with new version
docker compose up -d

# Remove old images
docker image prune -f
```
```bash
# Backup MongoDB
docker compose exec mongodb mongodump --out /backup

# Backup PostgreSQL
docker compose exec postgresql pg_dump -U postgres fastgpt > backup.sql

# Backup config
cp docker-compose.yml docker-compose.yml.backup
cp .env .env.backup
```
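Timestamping dumps and pruning old ones keeps a backup routine manageable. A sketch assuming backups are named `backup-YYYY-MM-DD.sql` (this naming convention is an assumption of this example, not something FastGPT mandates):

```python
from datetime import date, timedelta

def backup_name(day: date) -> str:
    """Derive a dated PostgreSQL dump filename (hypothetical convention)."""
    return f"backup-{day.isoformat()}.sql"

def stale_backups(names: list, today: date, keep_days: int = 7) -> list:
    """Return backup filenames older than keep_days, by their embedded date."""
    cutoff = today - timedelta(days=keep_days)
    stale = []
    for name in names:
        try:
            d = date.fromisoformat(name.removeprefix("backup-").removesuffix(".sql"))
        except ValueError:
            continue  # skip files that don't match the convention
        if d < cutoff:
            stale.append(name)
    return stale
```

For example, redirect the `pg_dump` output above to `backup_name(date.today())`, then periodically delete whatever `stale_backups` reports.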