Agentic AI Atlas by a5c.ai
Agentic AI Atlas · OpenCode
Page JSON

page:docs-agent-mux-reference-agents-opencode


Inspect the normalized record payload exactly as the atlas UI reads it.

File · wiki/docs/agent-mux/reference/agents/opencode.md · Cluster · wiki
Record JSON
{
  "id": "page:docs-agent-mux-reference-agents-opencode",
  "_kind": "Page",
  "_file": "wiki/docs/agent-mux/reference/agents/opencode.md",
  "_cluster": "wiki",
  "attributes": {
    "nodeKind": "Page",
    "sourcePath": "docs/agent-mux/reference/agents/opencode.md",
    "sourceKind": "repo-docs",
    "title": "OpenCode",
    "displayName": "OpenCode",
    "slug": "docs/agent-mux/reference/agents/opencode",
    "articlePath": "wiki/docs/agent-mux/reference/agents/opencode.md",
    "article": "\n# OpenCode\n\nAdapters for the **OpenCode** multi-provider AI coding agent from [anomalyco/opencode](https://github.com/anomalyco/opencode).\n\n⚠️ **Important**: This is **not** the same as the deprecated `opencode-ai/opencode` project (now `charmbracelet/crush`). Agent-mux specifically supports the `anomalyco/opencode` implementation.\n\n## Adapter Variants\n\nAgent-mux provides two OpenCode adapters:\n\n- **`opencode`** (subprocess) — Default CLI-based integration\n- **`opencode-http`** (remote) — HTTP server with SSE streaming for enhanced performance\n\n### Choosing an Adapter\n\n| Adapter | Type | Use Case | Performance |\n|---------|------|----------|-------------|\n| `opencode` | Subprocess | General use, simple setup | Standard |\n| `opencode-http` | Remote (HTTP) | High-performance, real-time streaming | Enhanced |\n\n**Recommendation**: Use `opencode-http` for production workloads or when performance is critical. Use `opencode` for simple use cases or debugging.\n\n## Install\n\n```bash\n# Install the CLI (required for both adapters)\namux install opencode\n```\n\nInstall methods:\n- npm: `npm install -g @anomalyco/opencode`\n- brew (macOS): `brew install --cask opencode`\n- curl: `curl -fsSL https://opencode.ai/install | bash`\n\nSupported on macOS, Linux and Windows.\n\n## Auth\n\nOpenCode supports multiple AI providers. 
Configure authentication using:\n\n```bash\nopencode auth\n```\n\nOr set environment variables directly:\n- `ANTHROPIC_API_KEY` — for Claude models\n- `OPENAI_API_KEY` — for GPT models  \n- `GOOGLE_API_KEY` — for Gemini models\n\nConfig file: `~/.config/opencode/config.json`.\n\n## Minimal run\n\n### CLI Examples\n\n```bash\n# Subprocess adapter (default)\namux run opencode --prompt \"Add error handling to this function\"\n\n# HTTP adapter (enhanced performance)\namux run opencode-http --prompt \"Add error handling to this function\"\n```\n\n### Programmatic Examples\n\n```ts\nimport { createClient } from '@a5c-ai/agent-mux';\nconst client = createClient();\n\n// Subprocess adapter\nconst handle1 = await client.run({ agent: 'opencode', prompt: 'Refactor this code' });\nfor await (const ev of handle1.events()) console.log(ev);\n\n// HTTP adapter\nconst handle2 = await client.run({ agent: 'opencode-http', prompt: 'Refactor this code' });\nfor await (const ev of handle2.events()) console.log(ev);\n```\n\n## Notable flags\n\nThe adapter forwards these `RunOptions` to the CLI:\n\n- `--format json` (always, for structured output parsing)\n- `--model <id>` — e.g. 
`claude-3-5-sonnet-20241022` or `gpt-4o`\n- `--session <id>` — create or resume a session\n- `--continue` — continue an existing session\n- `--fork <id>` — fork from an existing session\n- `--max-turns <n>` — limit conversation length\n- `--system <prompt>` — set system prompt\n\n## Session files\n\n- Location: `~/.config/opencode/sessions/*.jsonl`\n- Format: JSONL with OpenCode event structure\n- Resume and fork are both supported (`canResume`, `canFork`)\n\n## Plugins\n\nPlugin support: **yes** — MCP server integration.\n\n### MCP Servers\n```bash\namux mcp install opencode <mcp-server>\namux mcp list opencode\n```\n\nRegistry: https://modelcontextprotocol.io for MCP servers.\n\n## Models\n\nOpenCode supports multiple AI providers:\n\n| Model ID | Provider | Context | Notes |\n|----------|----------|---------|-------|\n| `claude-3-5-sonnet-20241022` | Anthropic | 200k | Default model |\n| `gpt-4o` | OpenAI | 128k | Alternative option |\n\nAdditional models available via provider configuration.\n\n## Capabilities\n\n### Shared Capabilities (both adapters)\n\n- **Text streaming** with real-time output\n- **Tool calling** with parallel execution support\n- **JSON mode** and structured output\n- **Image input** and file attachments\n- **Multi-turn conversations** with session persistence\n- **Subagent dispatch** (up to 5 parallel tasks)\n- **Interactive mode** with stdin injection\n- **Skills** and `AGENTS.md` support\n- **MCP server** plugin integration\n\n### HTTP Adapter Enhancements (`opencode-http`)\n\n- **Server-Sent Events (SSE)** streaming for reduced latency\n- **HTTP REST API** for direct integration\n- **Automatic server management** with health monitoring\n- **Enhanced connection pooling** for multiple concurrent sessions\n\n## HTTP Adapter Details\n\nThe `opencode-http` adapter manages an OpenCode server automatically:\n\n### Server Management\n- **Automatic startup**: Server starts via `opencode serve` when first connection is made\n- **Health 
monitoring**: Regular health checks ensure server availability\n- **Graceful shutdown**: Server cleanup on adapter disposal\n- **Port management**: Automatic port allocation to avoid conflicts\n\n### Connection Details\n- **Endpoint**: Typically `http://localhost:<dynamic-port>`\n- **SSE Streaming**: `/api/chat/stream` endpoint for real-time events\n- **Authentication**: Inherits OpenCode's configured auth providers\n- **Session persistence**: Maintains session state across requests\n\n### When to Use\n- **High throughput**: Multiple concurrent conversations\n- **Low latency**: Direct HTTP eliminates subprocess overhead\n- **Integration**: REST API for custom tooling\n- **Production**: Enhanced reliability and monitoring\n\n## Known limitations\n\n### Both Adapters\n- **No thinking streams** — OpenCode doesn't expose internal reasoning\n- **No image output** — text-only responses\n- **Tool approval required** — use `approvalMode: 'yolo'` to auto-approve in trusted environments\n\n### Subprocess Adapter (`opencode`)\n- **No PTY support** — runs in standard subprocess mode\n\n### HTTP Adapter (`opencode-http`)\n- **Server dependency** — requires successful `opencode serve` startup\n- **Port requirements** — needs available TCP port for server\n",
    "documents": []
  },
  "outgoingEdges": [],
  "incomingEdges": [
    {
      "from": "page:docs-agent-mux-reference",
      "to": "page:docs-agent-mux-reference-agents-opencode",
      "kind": "contains_page"
    }
  ]
}
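The record above can also be consumed programmatically. A minimal TypeScript sketch, assuming only the payload shape shown here (the `PageRecord` interface and the abbreviated `article` value are illustrative, not part of any published atlas API):

```typescript
// Shape of an atlas Page record, inferred from the payload above.
interface AtlasEdge {
  from: string;
  to: string;
  kind: string;
}

interface PageRecord {
  id: string;
  _kind: string;
  _file: string;
  _cluster: string;
  attributes: {
    title: string;
    slug: string;
    article: string;
  };
  outgoingEdges: AtlasEdge[];
  incomingEdges: AtlasEdge[];
}

// Abbreviated copy of the record shown above (article truncated for brevity).
const record: PageRecord = {
  id: "page:docs-agent-mux-reference-agents-opencode",
  _kind: "Page",
  _file: "wiki/docs/agent-mux/reference/agents/opencode.md",
  _cluster: "wiki",
  attributes: {
    title: "OpenCode",
    slug: "docs/agent-mux/reference/agents/opencode",
    article: "\n# OpenCode\n\nAdapters for the OpenCode multi-provider AI coding agent.",
  },
  outgoingEdges: [],
  incomingEdges: [
    {
      from: "page:docs-agent-mux-reference",
      to: "page:docs-agent-mux-reference-agents-opencode",
      kind: "contains_page",
    },
  ],
};

// Parent pages are the sources of incoming contains_page edges.
const parents = record.incomingEdges
  .filter((e) => e.kind === "contains_page")
  .map((e) => e.from);

// First Markdown heading in the article body, if any.
const heading = record.attributes.article
  .split("\n")
  .find((line) => line.startsWith("# "));

console.log(parents); // ["page:docs-agent-mux-reference"]
console.log(heading); // "# OpenCode"
```

Typing the payload this way lets downstream tooling (graph traversal, search indexing) fail at compile time if the record schema drifts.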
