OpenCatalog, curated by FLOSSK
AI & Machine Learning

LiteLLM

Unified OpenAI-compatible proxy and SDK for 100+ model providers (local, cloud, Bedrock, Azure) with budgets, fallbacks, and logging.

Why it is included

A standard glue layer for apps that must swap between Ollama, vLLM, and hosted APIs without rewriting their clients.
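The swap works because clients keep sending OpenAI-shaped requests while the gateway routes on a provider prefix in the model string (e.g. "ollama/llama3"). A minimal sketch of that routing idea, with hypothetical function names that are not LiteLLM's actual internals:

```python
def resolve_provider(model: str) -> tuple[str, str]:
    """Split an OpenAI-style model string into (provider, model_name).

    Prefixed names such as "ollama/llama3" carry the provider;
    bare names fall back to a default provider.
    """
    if "/" in model:
        provider, model_name = model.split("/", 1)
        return provider, model_name
    return "openai", model  # bare model names default to OpenAI


def build_chat_request(model: str, messages: list[dict]) -> dict:
    """Tag an OpenAI-shaped chat payload with its resolved provider."""
    provider, model_name = resolve_provider(model)
    return {
        "provider": provider,
        "payload": {"model": model_name, "messages": messages},
    }


req = build_chat_request("ollama/llama3", [{"role": "user", "content": "Hi"}])
print(req["provider"])          # → ollama
print(req["payload"]["model"])  # → llama3
```

In LiteLLM itself the same prefixed model string is passed to `litellm.completion(model=..., messages=...)`, so switching providers is a one-string change on the client side.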

Best for

Product teams consolidating multi-provider LLM routing behind one gateway.

Strengths

  • Provider breadth
  • Drop-in OpenAI API
  • Observability hooks

Limitations

  • Operational security for keys and logs is your responsibility

Good alternatives

Custom FastAPI · LangChain adapters
