Operator OS is a high-performance personal AI agent framework written in Go. Deploy a single self-contained binary to any device — from a Raspberry Pi to a cloud server — and connect your agents to the LLMs and messaging channels you already use.

Documentation Index
Fetch the complete documentation index at: https://mintlify.com/operatoronline/standard-operator/llms.txt
Use this file to discover all available pages before exploring further.
Quick Start
Get your agent online in under five minutes
Installation
Download binaries or build from source
Configuration
Configure models, channels, and tools
CLI Reference
Every command, flag, and option
Why Operator OS
Ultra-lightweight
Under 10MB RAM. Cold starts in under 1 second, even on a single-core 0.6GHz processor.
Single binary
Deploys as one self-contained executable across RISC-V, ARM64, and x86_64.
Persistent memory
Structural long-term memory that carries context across sessions and reboots.
Multi-channel
Slack, Discord, Telegram, WhatsApp, DingTalk, Feishu, LINE, WeCom, and QQ — out of the box.
Any LLM
OpenAI, Anthropic, Google Gemini, Groq, Ollama, DeepSeek, and more — zero code changes.
Extensible
Expand capabilities with MCP servers and installable skills from the ClawHub registry.
Get started in three steps
Initialize your workspace
Run the interactive onboarding wizard to create your configuration. This creates ~/.operator/config.json with your chosen LLM and channels.

Explore the docs
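Before diving into the individual docs pages, it may help to know roughly what the generated file looks like. The schema isn't reproduced on this page, so the following is a purely illustrative sketch — every field name here is an assumption, not the documented format; consult the Configuration reference for the real schema. A config pairing one LLM provider with one messaging channel might resemble:

```json
{
  "model": {
    "provider": "openai",
    "name": "gpt-4o",
    "api_key": "sk-..."
  },
  "channels": {
    "telegram": {
      "enabled": true,
      "bot_token": "..."
    }
  }
}
```

The onboarding wizard writes this file for you, so hand-editing is only needed when adding providers or channels later.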
AI Model providers
Configure OpenAI, Anthropic, Gemini, Ollama, and more
Messaging channels
Connect Slack, Telegram, Discord, WhatsApp, and others
Built-in tools
Shell execution, web search, filesystem, cron scheduling, and hardware I/O
MCP integration
Connect external Model Context Protocol servers
Skills marketplace
Discover and install skills from the ClawHub registry
Docker deployment
Run Operator OS fully containerized