II. Provider overview
Reference · provider:ollama
Ollama overview
Inspect the raw attributes, linked wiki pages, and inbound or outbound graph edges for provider:ollama.
Attributes
displayName
Ollama
vendor
Ollama (community)
versionRange
>=0.1.0
authMethods
- api-key
authMethodNotes
Local-first server (default `http://localhost:11434`). No auth on the
bare local server; `api-key` is used when fronting Ollama via a proxy
(e.g. cloud Ollama deployments). The `api-key` enum value is selected
here as the closest auth-method match.
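As a minimal sketch of the convention described above (assumptions: the proxy accepts a standard `Authorization: Bearer` header, and the helper name is illustrative, not part of any Ollama client library):

```python
def build_headers(api_key=None):
    """Build HTTP headers for an Ollama request.

    A bare local server (http://localhost:11434) needs no auth, so
    api_key stays None; a proxy-fronted deployment that enforces the
    `api-key` method gets a bearer token.
    """
    headers = {"Content-Type": "application/json"}
    if api_key is not None:
        # Only meaningful when a proxy in front of Ollama checks auth.
        headers["Authorization"] = f"Bearer {api_key}"
    return headers
```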
endpoints
- /api/chat
- /api/generate
- /api/embed
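For illustration, minimal request bodies for the endpoints referenced in this card's outgoing edges (`/api/chat`, `/api/generate`, `/api/embed`). Field names follow the public Ollama REST API; the model name is a placeholder:

```python
# /api/chat: conversational turns under "messages".
chat_body = {
    "model": "llama3.3",  # placeholder model tag
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": False,
}

# /api/generate: single-shot completion under "prompt".
generate_body = {"model": "llama3.3", "prompt": "Hello", "stream": False}

# /api/embed: one or more inputs to embed under "input".
embed_body = {"model": "llama3.3", "input": ["Hello"]}
```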
pricing
Free for local execution. Cloud-hosted Ollama deployments set their own
pricing.
rateLimitSignalingProtocol
None on the local server. HTTP 5xx surfaces upstream model-runtime
errors. JSON error envelope: `{ "error": "..." }`.
dataResidencyOptions
- on-prem
vendorFeatures
[]
slaTier
ollama-no-sla
regions
- on-prem
Outgoing edges
emits_message_type (6)
- protocol-message:ollama-chat-message · ProtocolMessage · Ollama /api/chat — non-streaming response
- protocol-message:ollama-chat-stream-chunk · ProtocolMessage · Ollama /api/chat — streaming chunk
- protocol-message:ollama-chat-stream-done · ProtocolMessage · Ollama /api/chat — streaming terminal
- protocol-message:ollama-generate-response · ProtocolMessage · Ollama /api/generate — non-streaming response
- protocol-message:ollama-generate-stream-chunk · ProtocolMessage · Ollama /api/generate — streaming chunk
- protocol-message:ollama-embed-response · ProtocolMessage · Ollama /api/embed — response
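The chat streaming messages above arrive as newline-delimited JSON chunks followed by a terminal chunk. A sketch of folding them into a final reply (the chunk shape, `{"message": {"content": ...}, "done": bool}`, follows the public Ollama API; the function name is illustrative):

```python
import json

def accumulate_chat_stream(ndjson_lines):
    """Fold /api/chat streaming chunks into the assistant's full text.

    Each line is a JSON chunk carrying a partial message; the terminal
    chunk (ollama-chat-stream-done) sets "done": true.
    """
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)
```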
realizes (1)
- layer:2-provider · Layer · Provider
serves (2)
- model:llama-3-3-70b-instruct@current · ModelVersion · Llama 3.3 70B Instruct
- model:gemma-2-27b@current · ModelVersion · Gemma 2 27B
Incoming edges
about_subject (2)
integrates_with (1)
- tool-server:mcp-ollama · ToolServer · Ollama MCP Server