
Configuration

Customize your slave's brain and behavior

After installation, configure your slave with:

slaves config

LLM Providers

Set your preferred AI backend. Claude Opus 4.6 is recommended for best results.

{
  "provider": "anthropic",
  "model": "claude-opus-4-6",
  "apiKey": "sk-ant-..."
}

Supported providers:

  • Anthropic (Claude) — recommended
  • OpenAI (GPT)
  • DeepSeek
  • Ollama (local)
  • AWS Bedrock
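
For a local backend, the same config shape applies. As an illustrative sketch (the model name here is a placeholder, not a documented default), an Ollama setup might look like:

```json
{
  "provider": "ollama",
  "model": "llama3"
}
```

A local provider typically needs no API key, so that field can be omitted.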

Channels

Enable the messaging platforms your slave should connect to. Each channel has its own credentials section.
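
As a hypothetical sketch of the shape (the channel name and credential field below are illustrative; the actual fields for each platform are documented in its setup guide), an enabled channel with its credentials section might look like:

```json
{
  "channels": {
    "telegram": {
      "botToken": "..."
    }
  }
}
```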

See Channels Overview for per-channel setup guides.

Persona

The slave persona defines how your AI worker communicates. See Personas for customization.

Environment variables

All config can also be set via environment variables:

Variable          Description
SLAVE_PROVIDER    LLM provider name
SLAVE_MODEL       Model identifier
SLAVE_API_KEY     Provider API key
SLAVE_CHANNELS    Comma-separated channel list
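
For example, the config shown earlier could be expressed entirely through the environment (all values below are placeholders; the channel names in particular are hypothetical):

```shell
# Variable names are from the table above; values are placeholders.
export SLAVE_PROVIDER="anthropic"         # LLM provider name
export SLAVE_MODEL="claude-opus-4-6"      # model identifier
export SLAVE_API_KEY="sk-ant-placeholder" # provider API key (use your real key)
export SLAVE_CHANNELS="telegram,discord"  # hypothetical comma-separated channel list
```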
