liteLLM

One library to standardize all LLM APIs

5.0 · 18 reviews · 150 followers
Simplify using the OpenAI, Azure, Cohere, Anthropic, Replicate, and Google LLM APIs. TL;DR: call all LLM APIs using the ChatGPT format, completion(model, messages), with consistent outputs and exceptions across providers, plus logging and error tracking for all models.
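The unified call shape described above can be sketched in pure Python. This is an illustrative mock, not liteLLM's actual internals: the routing table, the stand-in provider call, and the prefix-based dispatch are all assumptions made for the example; only the `completion(model, messages)` signature and the OpenAI-style response shape come from the description.

```python
def _fake_provider_call(provider, model, messages):
    # Stand-in for a real provider SDK call (OpenAI, Anthropic, ...);
    # a real client would send an HTTP request here.
    last = messages[-1]["content"]
    return f"[{provider}:{model}] echo: {last}"

# Hypothetical model-prefix -> provider routing table (illustrative only).
ROUTES = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "command-": "cohere",
}

def completion(model, messages):
    """Route any model name to its provider; return one consistent shape."""
    provider = next(
        (p for prefix, p in ROUTES.items() if model.startswith(prefix)),
        "unknown",
    )
    text = _fake_provider_call(provider, model, messages)
    # The same OpenAI-style output structure regardless of provider.
    return {"choices": [{"message": {"role": "assistant", "content": text}}]}
```

The point of the pattern: calling code reads `response["choices"][0]["message"]["content"]` the same way whether the model name was `gpt-4`, `claude-3`, or anything else.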

liteLLM Reviews

The community has submitted 18 reviews covering what they like about liteLLM, what it could do better, and more.

Reviewers describe liteLLM as a practical LLM proxy that makes it easier to work across multiple providers, including Groq, OpenRouter, and local models via Ollama, exposing an OpenAI-compatible API that slots into many apps wherever the base URL can be set. The lone user review highlights useful caching, load balancing, and pairing with Langfuse for prompt and session monitoring. Founder feedback from other makers echoes that it is efficient, speeds up implementation, and reduces vendor lock-in.
Summarized with AI
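The caching and load-balancing behaviors reviewers mention can be sketched abstractly. Everything here is an assumption made for illustration, not liteLLM's configuration format or routing policy: the deployment names, the single model alias, the round-robin choice, and the in-memory cache are all hypothetical.

```python
import itertools

# Hypothetical deployments behind one model alias (names are illustrative).
DEPLOYMENTS = {"gpt-4": ["azure/gpt-4-eu", "azure/gpt-4-us", "openai/gpt-4"]}
_round_robin = {alias: itertools.cycle(targets)
                for alias, targets in DEPLOYMENTS.items()}
_cache = {}

def route(alias, prompt):
    """Serve repeats from cache; otherwise round-robin across deployments."""
    key = (alias, prompt)
    if key in _cache:
        return _cache[key]            # cache hit: no provider call at all
    deployment = next(_round_robin[alias])  # spread load across deployments
    answer = f"{deployment} -> {prompt}"    # stand-in for the real LLM call
    _cache[key] = answer
    return answer
```

A repeated prompt returns the cached answer without consuming a deployment slot, while distinct prompts rotate through the configured deployments.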