local-llm-router
Route queries to local LLMs offline.
Category: Community
Author: hoodini
Version: 1.0.0
Installs: 0
System Documentation
What problem does it solve?
This skill routes AI coding queries to local LLM services in air-gapped environments, enabling offline development with secure, private model routing built on Serena MCP.
Core Features & Use Cases
- Local LLM routing: Detects available local models (Ollama, LM Studio, Jan, OpenWebUI) and forwards queries to the most suitable one.
- Serena MCP integration: Uses Serena for semantic code understanding to improve symbol-level awareness and precise changes.
- Use Case: In a secure IDE workflow, route a coding task to the best local model and receive code suggestions without leaving the air gap.
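The detection step described above can be sketched as a simple probe of each service's local HTTP endpoint. This is an illustrative sketch, not the skill's actual handler; the endpoint paths and ports shown (11434 for Ollama, 1234 for LM Studio, 1337 for Jan) are the services' common defaults and may differ in your setup.

```python
import urllib.request

# Common default local endpoints (assumptions; adjust ports for your setup).
SERVICES = {
    "ollama": "http://localhost:11434/api/tags",
    "lmstudio": "http://localhost:1234/v1/models",
    "jan": "http://localhost:1337/v1/models",
}

def detect_services(timeout: float = 0.5) -> list[str]:
    """Return the names of local LLM services that answer their endpoint."""
    available = []
    for name, url in SERVICES.items():
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status == 200:
                    available.append(name)
        except OSError:
            pass  # service not running or unreachable; skip it
    return available
```

A router built on this would then forward each query to the most suitable service in the returned list.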
Quick Start
- Install Serena MCP.
- Ensure at least one local LLM service is running (Ollama, LM Studio, or Jan).
- Verify each service's health endpoint responds.
- Start routing queries through the provided handler.
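As a minimal sketch of the final step, the snippet below forwards a prompt to a local Ollama instance via its `/api/generate` endpoint. It is illustrative only: the skill's actual handler may differ, and the model name `llama3` is an assumption, so substitute whatever model you have pulled locally.

```python
import json
import urllib.request

def build_payload(prompt: str, model: str = "llama3") -> dict:
    # Ollama's /api/generate expects model, prompt, and a stream flag;
    # stream=False returns the whole completion in one JSON object.
    return {"model": model, "prompt": prompt, "stream": False}

def route_query(prompt: str, base_url: str = "http://localhost:11434",
                model: str = "llama3") -> str:
    """Send a prompt to a local Ollama instance and return its completion."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read())["response"]
```

Because the call never leaves localhost, the query stays inside the air gap.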
Dependency Matrix
Required Modules
None required
Components
Standard package
💻 Claude Code Installation
Recommended: Let Claude install automatically. Simply copy and paste the text below into Claude Code.
Please help me install this Skill:
Name: local-llm-router
Download link: https://github.com/hoodini/ai-agents-skills/archive/main.zip#local-llm-router
Please download this .zip file, extract it, and install it in the .claude/skills/ directory.