# Welcome to Agent One
Agent One is a personal and business AI agent runtime written in Go. It receives signals from messaging platforms (WhatsApp, Telegram, ClickUp, CLI), routes them through any LLM via LiteLLM proxy, executes tools via MCP, and responds — all from a single self-hosted binary.
## Why Agent One

- Single binary — `go build` produces one executable. No Docker required.
- Multi-model — route to Kimi, Claude, DeepSeek, Gemini, Ollama, or any LLM via LiteLLM.
- Multi-channel — WhatsApp, Telegram, Slack, ClickUp, CLI, webhooks.
- MCP-native — any MCP server becomes a new capability with 4 lines of config.
- Self-hosted — you own the binary, the data, and the infrastructure.
- Chat-configurable — change schedules, budgets, and channels by talking to the agent.
## Quick Start

```shell
# Prerequisites: Go 1.22+, LiteLLM running on localhost:4000
go build -o agent-one ./cmd/agent
./agent-one          # Run with CLI channel
./agent-one --tui    # Run with terminal UI
```

## Architecture
```text
Signals ──> Event Queue ──> LLM Router ──> Tool Execution ──> Response
                                │
                         ┌──────┴──────┐
                         │  Agent One  │
                         │ (Go binary) │
                         └──────┬──────┘
                                │ HTTP (OpenAI format)
                         ┌──────┴──────┐
                         │   LiteLLM   │
                         │   (proxy)   │
                         └──────┬──────┘
                          ┌─────┼─────┐
                       Kimi Claude DeepSeek Ollama ...
```

Signals (Message, Cron, Heartbeat, Webhook, Hook) flow through an event queue to the LLM router. The router picks a model complexity level (cheap, medium, or smart) and sends the request to LiteLLM. Tool calls execute via MCP servers, and responses route back through the originating channel.
## What’s Next

- Installation — get Agent One running locally
- Configuration — set up models, channels, and tools
- Core Concepts — understand signals, channels, personas, and memory