Why Self-Host?

Full Control

Host on your infrastructure. Your data and API keys never leave your network.

Scalable

Deploy multiple instances behind a load balancer as demand grows.

API Access

Expose agents via HTTP API for integration with other services.

Always On

Run agents 24/7 without keeping your local machine running.

Quick Start

Run the official Docker image:
docker run -d \
  -p 12233:12233 \
  -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  -v $(pwd)/agents:/agents \
  ghcr.io/agentuse/agentuse:latest
Your AgentUse server is now running at http://localhost:12233. Test it:
curl http://localhost:12233/run \
  -H "Content-Type: application/json" \
  -d '{"agent": "/agents/hello.agentuse"}'

What’s Included

Component   Version   Purpose
AgentUse    Latest    Bun-compiled binary
Node.js     20.x      Run JavaScript scripts
Python      3.12      Run Python scripts
git         Latest    Version control operations
curl        Latest    HTTP requests
jq          Latest    JSON processing
bash        Latest    Shell scripts
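To see exactly which tool versions your running container ships, you can query them directly; <container-id> is a placeholder for your container's name or ID:
docker exec -it <container-id> sh -c 'node --version && python3 --version && git --version && jq --version'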

Configuration

Environment Variables

Variable                  Required   Description
ANTHROPIC_API_KEY         Yes*       Anthropic Claude API key
CLAUDE_CODE_OAUTH_TOKEN   Yes*       Long-lived OAuth token from claude setup-token (valid 1 year)
OPENAI_API_KEY            Yes*       OpenAI API key
OPENROUTER_API_KEY        No         OpenRouter API key
*At least one AI provider credential is required. For Anthropic, use either ANTHROPIC_API_KEY or CLAUDE_CODE_OAUTH_TOKEN.
To get your OAuth token, run claude setup-token in the Claude Code CLI. This creates a long-lived token valid for 1 year.
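For example, to authenticate with the OAuth token instead of an API key, pass it the same way as in the Quick Start; only the environment variable changes:
docker run -d \
  -p 12233:12233 \
  -e CLAUDE_CODE_OAUTH_TOKEN=$CLAUDE_CODE_OAUTH_TOKEN \
  -v $(pwd)/agents:/agents \
  ghcr.io/agentuse/agentuse:latest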

Using an Environment File

docker run -d \
  -p 12233:12233 \
  --env-file .env \
  -v $(pwd)/agents:/agents \
  ghcr.io/agentuse/agentuse:latest
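The .env file holds one VAR=value pair per line, with no quotes and no export keyword; the values below are placeholders, and only one provider key is strictly required:
# .env (placeholders - replace with your real keys)
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=...
OPENROUTER_API_KEY=...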

Including Your Agents

Option 1: Volume Mount

Mount agent files at runtime:
docker run -d \
  -p 12233:12233 \
  -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  -v $(pwd)/agents:/agents \
  ghcr.io/agentuse/agentuse:latest
Best for development - no rebuild needed when agents change.

Option 2: Extend Base Image

Create a Dockerfile in your project:
FROM ghcr.io/agentuse/agentuse:latest

COPY ./agents /agents
Build and deploy:
docker build -t my-company/agents:latest .
docker run -d -p 12233:12233 -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY my-company/agents:latest
Best for production - self-contained, versioned, deployable.
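If you deploy from a registry, the usual Docker tagging workflow applies; the registry path and version tag here are examples only:
docker build -t my-company/agents:1.2.0 .
docker push my-company/agents:1.2.0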

Option 3: Docker Compose

# docker-compose.yml
version: '3.8'
services:
  agentuse:
    image: ghcr.io/agentuse/agentuse:latest
    ports:
      - "12233:12233"
    environment:
      - ANTHROPIC_API_KEY
      - OPENAI_API_KEY
    volumes:
      - ./agents:/agents
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:12233/run"]
      interval: 30s
      timeout: 10s
      retries: 3
docker-compose up -d
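Standard Compose commands handle status checks and log tailing for the service (the service name comes from the file above):
docker-compose ps
docker-compose logs -f agentuse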

Extending the Base Image

Add packages your agents need:
FROM ghcr.io/agentuse/agentuse:latest

# System packages
RUN apk add --no-cache ffmpeg imagemagick

# Python packages
RUN pip3 install --no-cache-dir pandas numpy requests

# Node.js packages
RUN npm install -g typescript puppeteer

# Copy your agents
COPY ./agents /agents
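To confirm the extra packages made it into the image, build it and run a one-off shell; --entrypoint overrides the image's default agentuse serve entrypoint, and the image tag is just an example:
docker build -t my-company/agents:custom .
docker run --rm --entrypoint sh my-company/agents:custom -c 'ffmpeg -version | head -n1 && python3 -c "import pandas, numpy" && tsc --version'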

Building from Source

Build for a single platform (ARM64 in this example):
docker build --platform linux/arm64 -t agentuse:latest .
Build and push multi-arch:
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -t myregistry/agentuse:latest \
  --push .
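Multi-arch builds require a buildx builder with emulation support; if you have not created one yet, this one-time setup is typical (the builder name is arbitrary):
docker buildx create --name multiarch --use
docker buildx inspect --bootstrap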

Production Deployment

For production use, run behind a reverse proxy with process management.

Reverse Proxy with nginx

server {
    listen 443 ssl;
    server_name agents.example.com;

    location / {
        proxy_pass http://127.0.0.1:12233;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_read_timeout 300s;
    }
}
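After editing the config, validate and reload nginx (as root). Note that nginx -t will fail on the SSL listener until certificates exist; the certbot command further below adds the ssl_certificate directives for you:
nginx -t
systemctl reload nginx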

Process Management with PM2

// ecosystem.config.js
module.exports = {
  apps: [{
    name: 'agentuse-server',
    script: 'npx',
    args: 'agentuse serve --port 12233',
    cwd: '/path/to/project',
    env: {
      AGENTUSE_API_KEY: 'your-secret-key',
      ANTHROPIC_API_KEY: 'sk-ant-...'
    }
  }]
}
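With that file saved, the standard PM2 workflow starts the server, persists the process list, and registers a boot hook (pm2 startup prints a command to run once):
pm2 start ecosystem.config.js
pm2 save
pm2 startup
pm2 logs agentuse-server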
Configure rate limiting in nginx to prevent abuse. The limit_req_zone directive belongs in the http context; the location block goes inside your server block:
limit_req_zone $binary_remote_addr zone=api:10m rate=10r/s;

location / {
    limit_req zone=api burst=20 nodelay;
    proxy_pass http://127.0.0.1:12233;
}
Use Let’s Encrypt for free SSL:
certbot --nginx -d agents.example.com
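Certbot configures automatic renewal on most distributions; a dry run verifies that renewal will succeed:
certbot renew --dry-run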

Troubleshooting

Server won't start

Check the logs (use your container name or ID):
docker logs agentuse
Common causes:
  • Missing API keys
  • Port already in use

Can't connect to the server

The official image binds to 0.0.0.0 by default. If you are building a custom image, make sure the server listens on all interfaces:
ENTRYPOINT ["/usr/local/bin/agentuse", "serve", "-H", "0.0.0.0"]

Agent not found

Check the volume mount:
docker exec -it agentuse ls /agents
Ensure agent paths in API requests match paths inside the container (e.g. /agents/hello.agentuse), not host paths.
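To rule out host networking issues, you can also run the same test request from inside the container; curl is bundled in the image, and agentuse here is assumed to be your container name:
docker exec -it agentuse curl http://localhost:12233/run \
  -H "Content-Type: application/json" \
  -d '{"agent": "/agents/hello.agentuse"}'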