Agentic AI Atlas by a5c.ai
Agentic AI Atlas · Ollama
Provider overview

provider:ollama


Ollama overview

Inspect the raw attributes, linked wiki pages, and inbound and outbound graph edges for `provider:ollama`.

Provider · Outgoing edges: 9 · Incoming edges: 3

Attributes

  • displayName: Ollama
  • vendor: Ollama (community)
  • versionRange: >=0.1.0
  • authMethods: api-key
  • authMethodNotes: Local-first server (default `http://localhost:11434`). No auth on the bare local server; `api-key` is used when fronting Ollama via a proxy (e.g. cloud Ollama deployments). The `api-key` enum value is selected here as the closest auth-method match.
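The auth note above can be sketched as a small header-building helper. This is a minimal sketch, not part of the record: the choice of a `Bearer` token is an assumption about how a fronting proxy typically accepts the key — check your proxy's own documentation.

```python
def build_headers(api_key=None):
    """Build request headers for an Ollama endpoint.

    The bare local server needs no auth. When fronting Ollama with a
    proxy, the api-key is assumed here to travel as a Bearer token
    (an illustrative convention, not mandated by Ollama itself).
    """
    headers = {"Content-Type": "application/json"}
    if api_key is not None:
        headers["Authorization"] = f"Bearer {api_key}"
    return headers

# Local server: no Authorization header is attached.
local = build_headers()
# Proxied deployment: the key rides along as a Bearer token.
proxied = build_headers("sk-example-key")
```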
  • endpoints:
      base: http://localhost:11434
      chat: http://localhost:11434/api/chat
      generate: http://localhost:11434/api/generate
      embed: http://localhost:11434/api/embed
      tags: http://localhost:11434/api/tags
      show: http://localhost:11434/api/show
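A minimal non-streaming call against the `chat` endpoint above might look like the sketch below. The model name `llama3.3` is illustrative; the request and response shapes (`model`, `messages`, `stream`, and the reply under `message.content`) follow Ollama's `/api/chat` contract.

```python
import json
import urllib.request

BASE = "http://localhost:11434"  # default local server from the record

def build_chat_payload(messages, model="llama3.3", stream=False):
    """Serialize a /api/chat request body (model name is illustrative)."""
    return json.dumps({"model": model, "messages": messages, "stream": stream})

def chat(messages, model="llama3.3", base=BASE):
    """POST a non-streaming chat request and return the reply text."""
    req = urllib.request.Request(
        f"{base}/api/chat",
        data=build_chat_payload(messages, model).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the full reply in message.content.
        return json.load(resp)["message"]["content"]
```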
  • pricing: Free for local execution. Cloud-hosted Ollama deployments price independently.
  • rateLimitSignalingProtocol: None on the local server. HTTP 5xx surfaces upstream model-runtime errors. JSON error envelope: `{ "error": "..." }`.
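The error envelope described above can be unwrapped with a small helper — a sketch assuming only the `{ "error": "..." }` shape; the `OllamaError` class is a name invented here for illustration.

```python
import json

class OllamaError(RuntimeError):
    """Raised when a response body carries the {"error": "..."} envelope."""

def unwrap(body: str):
    """Parse a JSON response body, raising OllamaError on the error envelope."""
    data = json.loads(body)
    if isinstance(data, dict) and "error" in data:
        raise OllamaError(data["error"])
    return data
```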
  • dataResidencyOptions: on-prem
  • vendorFeatures: []
  • slaTier: ollama-no-sla
  • regions: on-prem

Outgoing edges

emits_message_type · 6
  • protocol-message:ollama-chat-message (ProtocolMessage) — Ollama /api/chat, non-streaming response
  • protocol-message:ollama-chat-stream-chunk (ProtocolMessage) — Ollama /api/chat, streaming chunk
  • protocol-message:ollama-chat-stream-done (ProtocolMessage) — Ollama /api/chat, streaming terminal
  • protocol-message:ollama-generate-response (ProtocolMessage) — Ollama /api/generate, non-streaming response
  • protocol-message:ollama-generate-stream-chunk (ProtocolMessage) — Ollama /api/generate, streaming chunk
  • protocol-message:ollama-embed-response (ProtocolMessage) — Ollama /api/embed, response
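The streaming message types above arrive as newline-delimited JSON: each `/api/chat` chunk carries a partial reply in `message.content`, and the terminal chunk sets `done: true`. A minimal accumulator sketch, with an illustrative chunk sequence:

```python
import json

def collect_stream(lines):
    """Accumulate streamed /api/chat chunks into the full reply text.

    `lines` yields NDJSON strings; partial text lives in message.content
    and the terminal chunk is flagged with done: true.
    """
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Example chunk sequence shaped like a streaming /api/chat response.
sample = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo"}, "done": false}',
    '{"message": {"role": "assistant", "content": ""}, "done": true}',
]
reply = collect_stream(sample)  # -> "Hello"
```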
realizes · 1
  • layer:2-provider (Layer) — Provider

serves · 2
  • model:llama-3-3-70b-instruct@current (ModelVersion) — Llama 3.3 70B Instruct
  • model:gemma-2-27b@current (ModelVersion) — Gemma 2 27B

Incoming edges

about_subject · 2
  • claim:ollama-provider-native-api-endpoints (Claim)
  • claim:ollama-native-protocol-messages (Claim)

integrates_with · 1
  • tool-server:mcp-ollama (ToolServer) — Ollama MCP Server

Related pages

No related wiki pages for this record.
