Agentic AI Atlas · Supports local models
capability:supports-local-models · a5c.ai
Capability JSON

capability:supports-local-models

Structured · live

Supports local models · JSON

Inspect the normalized record payload exactly as the atlas UI reads it.

File · capabilities/capabilities/missing-universal-capabilities.yaml
Cluster · capabilities
Record JSON
{
  "id": "capability:supports-local-models",
  "_kind": "Capability",
  "_file": "capabilities/capabilities/missing-universal-capabilities.yaml",
  "_cluster": "capabilities",
  "attributes": {
    "displayName": "Supports local models",
    "description": "The agent can route inference to a local model server (Ollama,\nllama.cpp, vLLM, LM Studio) instead of a hosted provider. Pairs\nwith the Local Model Source provider sub-component (Layer 2).\n",
    "appliesToNodeKinds": [
      "AgentVersion",
      "AgentCoreImpl"
    ],
    "category": "local-inference"
  },
  "outgoingEdges": [],
  "incomingEdges": [
    {
      "from": "agentVersion:claude:ge-0-0-0",
      "to": "capability:supports-local-models",
      "kind": "supports",
      "attributes": {
        "versionRange": ">=1.0.0 <2.0.0",
        "level": "partial",
        "notes": "Claude Code routes inference through Anthropic / Bedrock / Vertex by default; local-model routing is via transport-mux + a Local Model Source provider configured in settings."
      }
    }
  ]
}
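For consumers of this record, the edge-level `versionRange` (here `>=1.0.0 <2.0.0`) determines which agent versions the `supports` edge applies to. A minimal sketch of evaluating that range against a candidate version, assuming the space-separated comparator grammar shown in the payload (the `satisfies` helper and the inlined record subset are illustrative, not part of the atlas API):

```python
import operator

# Comparator operators, longest prefixes first so ">=" is matched before ">".
OPS = {">=": operator.ge, "<=": operator.le, "==": operator.eq,
       ">": operator.gt, "<": operator.lt}

def parse_version(s):
    """'1.4.2' -> (1, 4, 2), so tuples compare component-wise."""
    return tuple(int(part) for part in s.split("."))

def satisfies(version, version_range):
    """True if `version` matches every space-separated comparator."""
    for comparator in version_range.split():
        for op_str in OPS:
            if comparator.startswith(op_str):
                bound = parse_version(comparator[len(op_str):])
                if not OPS[op_str](parse_version(version), bound):
                    return False
                break
        else:
            raise ValueError(f"unrecognized comparator: {comparator!r}")
    return True

# Subset of the record above: one incoming "supports" edge.
record = {
    "id": "capability:supports-local-models",
    "incomingEdges": [
        {
            "from": "agentVersion:claude:ge-0-0-0",
            "kind": "supports",
            "attributes": {"versionRange": ">=1.0.0 <2.0.0",
                           "level": "partial"},
        }
    ],
}

for edge in record["incomingEdges"]:
    rng = edge["attributes"]["versionRange"]
    print(edge["from"], satisfies("1.4.2", rng))  # in range -> True
    print(edge["from"], satisfies("2.1.0", rng))  # above range -> False
```

The same check generalizes to any edge whose `attributes.versionRange` uses this comparator style; a production reader would likely delegate to a full semver library instead of hand-rolling the grammar.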