# Configuration
Customize your slave's brain and behavior
After installation, configure your slave using the sections below.
## LLM Providers
Set your preferred AI backend. Claude Opus 4.6 is recommended for best results.
Supported providers:
- Anthropic (Claude) — recommended
- OpenAI (GPT)
- DeepSeek
- Ollama (local)
- AWS Bedrock
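Selecting a provider can be sketched with the environment variables documented below; the model identifier and API key are illustrative placeholders, so check your provider's model list for the exact name:

```shell
# Choose Anthropic as the LLM backend.
export SLAVE_PROVIDER="anthropic"

# Illustrative model identifier -- verify the exact string against
# your provider's published model list.
export SLAVE_MODEL="claude-opus-4-6"

# Placeholder API key -- substitute your own credential.
export SLAVE_API_KEY="sk-ant-..."
```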
## Channels
Enable the messaging platforms your slave should connect to. Each channel has its own credentials section.
See Channels Overview for per-channel setup guides.
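Channels can also be enabled through the `SLAVE_CHANNELS` variable documented below; the channel names here are illustrative, so consult the Channels Overview for the exact identifiers:

```shell
# Enable two channels via a comma-separated list.
# "telegram" and "discord" are example names, not confirmed identifiers.
export SLAVE_CHANNELS="telegram,discord"
```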
## Persona
The slave persona defines how your AI worker communicates. See Personas for customization.
## Environment variables
All configuration options can also be set via environment variables:
| Variable | Description |
|---|---|
| `SLAVE_PROVIDER` | LLM provider name |
| `SLAVE_MODEL` | Model identifier |
| `SLAVE_API_KEY` | Provider API key |
| `SLAVE_CHANNELS` | Comma-separated channel list |
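A generic bash pattern (not part of the project itself) can verify that the required variables from the table above are set before launch; the values here are placeholders:

```shell
# Fail early if any named environment variable is unset or empty.
require_vars() {
  for name in "$@"; do
    # ${!name} is bash indirect expansion: the value of the
    # variable whose name is stored in $name.
    if [ -z "${!name}" ]; then
      echo "missing: $name" >&2
      return 1
    fi
  done
}

export SLAVE_PROVIDER="anthropic"   # placeholder values
export SLAVE_API_KEY="sk-ant-..."
require_vars SLAVE_PROVIDER SLAVE_API_KEY && echo "config ok"
```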