II.
Tool overview
Reference · livetool:litellm
LiteLLM overview
Python proxy and SDK that provides a unified OpenAI-compatible interface to 100+ LLM providers, including Anthropic, Azure, Bedrock, Vertex AI, Ollama, and Hugging Face. Features include load balancing, fallbacks, spend tracking, rate limiting, and a proxy server mode that lets any OpenAI-compatible client talk to any supported backend.
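The fallback behavior mentioned above can be sketched generically: one OpenAI-style call shape (model name + messages) dispatched to different providers, trying each model in order until one succeeds. This is an illustrative stand-in for the pattern, not litellm's actual API; the backend functions and the `call_with_fallbacks` helper are hypothetical.

```python
# Illustrative sketch of a unified interface with a fallback chain.
# The provider prefix in the model string (e.g. "anthropic/...") selects
# the backend; providers here are hypothetical stand-ins, not litellm code.

def call_with_fallbacks(messages, models, backends):
    """Try each model in order; return the first successful response."""
    last_err = None
    for model in models:
        provider = model.split("/")[0]          # e.g. "anthropic" from "anthropic/claude-..."
        try:
            return {"model": model, "content": backends[provider](model, messages)}
        except Exception as err:
            last_err = err                      # remember the failure, try the next model
    raise RuntimeError(f"all fallbacks exhausted: {last_err}")

# Hypothetical backends: the first fails, the fallback succeeds.
def flaky_anthropic(model, messages):
    raise TimeoutError("provider timeout")

def ok_ollama(model, messages):
    return f"echo from {model}: {messages[-1]['content']}"

backends = {"anthropic": flaky_anthropic, "ollama": ok_ollama}
resp = call_with_fallbacks(
    [{"role": "user", "content": "hello"}],
    ["anthropic/claude-3-5-sonnet", "ollama/llama3"],
    backends,
)
```

Because every provider is called with the same `(model, messages)` shape, swapping or reordering backends requires no change to calling code, which is the core of the unified-interface idea.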
Attributes
displayName
LiteLLM
homepageUrl
kind
other
description
Python proxy and SDK that provides a unified OpenAI-compatible interface to 100+ LLM providers, including Anthropic, Azure, Bedrock, Vertex AI, Ollama, and Hugging Face. Features include load balancing, fallbacks, spend tracking, rate limiting, and a proxy server mode that lets any OpenAI-compatible client talk to any supported backend.
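The proxy server mode is driven by a YAML config that maps public model names to provider-specific parameters. A minimal sketch, assuming current LiteLLM config conventions; the model name, provider model ID, and environment-variable names are placeholders:

```yaml
# Minimal LiteLLM proxy config sketch (names are placeholders).
model_list:
  - model_name: my-claude            # name clients request
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: my-local
    litellm_params:
      model: ollama/llama3
```

Once started with this config, any OpenAI-compatible client can point its base URL at the proxy and request `my-claude` or `my-local` as the model name.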
Outgoing edges
alternative_to (3)
- tool:openrouter · Tool · OpenRouter
- tool:portkey-ai · Tool · Portkey AI
- tool:helicone · Tool · Helicone
belongs_to_language (1)
- language:python · Language · Python
tool_used_by (2)
- skill-area:ai-agent-development · SkillArea · AI Agent Development
- skill-area:model-serving-operations · SkillArea · Model Serving
Incoming edges
alternative_to (3)
- tool:openrouter · Tool · OpenRouter
- tool:portkey-ai · Tool · Portkey AI
- tool:helicone · Tool · Helicone
composed_of (1)
- stack-profile:ai-agent-stack · StackProfile · AI Agent Stack (LLM, Vector DB, Orchestration, Memory)