llm-gateway

Community

Unified LLM API Gateway

Author: BagelHole
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill streamlines LLM API interactions by providing a single, intelligent gateway that manages multiple LLM providers, enforces usage policies, and optimizes costs.

Core Features & Use Cases

  • Unified API Endpoint: Route requests to various LLMs (OpenAI, Anthropic, self-hosted) through one interface.
  • Cost & Usage Management: Implement rate limiting, track spending, and set budgets per user or team.
  • Intelligent Routing & Fallbacks: Automatically switch to cheaper or available models when primary ones fail.
  • Semantic Caching: Reduce API costs and latency by caching similar requests.
  • Use Case: A startup needs to integrate multiple LLMs for different tasks but wants a single API for their applications, with strict cost controls and automatic failover to ensure service availability.
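The intelligent routing with fallbacks described above can be sketched as a simple priority loop. This is an illustrative sketch, not the gateway's actual implementation; the provider names and call signature are hypothetical stand-ins for real API clients.

```python
from typing import Callable, Any

def route_with_fallback(prompt: str, providers: list[tuple[str, Callable[[str], Any]]]):
    """Try each (name, call_fn) pair in priority order; return the first success.

    A real gateway would also distinguish retryable errors (timeouts, rate
    limits) from permanent ones, and could order providers by cost.
    """
    errors = {}
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:
            errors[name] = exc  # record the failure and try the next provider
    raise RuntimeError(f"All providers failed: {errors}")

# Stub providers standing in for real LLM API clients.
def primary(prompt: str) -> str:
    raise TimeoutError("primary model unavailable")

def fallback(prompt: str) -> str:
    return f"echo: {prompt}"

name, reply = route_with_fallback("hello", [("gpt-4o", primary), ("claude", fallback)])
print(name, reply)  # the request falls through to the second provider
```

The same loop generalizes to cost-based routing: sort the provider list by price per token before calling, and the cheapest available model answers first.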

Quick Start

Deploy the LiteLLM Proxy with Docker, supplying your OpenAI and Anthropic API keys and a configuration file.
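A minimal configuration might look like the following sketch. The model names and versions here are assumptions; consult the LiteLLM documentation for the exact values supported by your deployment.

```yaml
# litellm_config.yaml -- example model list; model IDs are illustrative.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```

You could then launch the proxy with something like `docker run -d -p 4000:4000 -e OPENAI_API_KEY -e ANTHROPIC_API_KEY -v "$(pwd)/litellm_config.yaml:/app/config.yaml" ghcr.io/berriai/litellm:main-latest --config /app/config.yaml` (image tag and port are assumptions; verify against the current LiteLLM docs).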

Dependency Matrix

Required Modules

None required

Components

scripts, references

💻 Claude Code Installation

Recommended: Let Claude install automatically. Simply copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: llm-gateway
Download link: https://github.com/BagelHole/DevOps-Security-Agent-Skills/archive/main.zip#llm-gateway

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
