model-routing

Community

Streamline local LLM setup.

Author: CxxxxDxxxF
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill streamlines configuring and verifying local model routing between Ollama and AirLLM, enabling one-click model loading and execution.

Core Features & Use Cases

  • Server Status Verification: Checks the health and model lists of local LLM servers.
  • Model Loading: Triggers model loading and captures precise responses for debugging.
  • Configuration Updates: Updates provider URLs and configuration after a model loads successfully.
  • End-to-End Testing: Confirms that generation requests work correctly after setup.
  • Error Guidance: Provides clear next steps for resolving cache or runtime issues.
  • Use Case: When you've just installed a new model locally, use this Skill to confirm it's accessible by both Ollama and AirLLM and ready for use.
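The server-status check described above amounts to querying Ollama's local HTTP API. A minimal sketch in Python, assuming Ollama's default endpoint at `localhost:11434` (the Skill's actual implementation may differ):

```python
import json
import urllib.error
import urllib.request


def check_ollama_status(base_url: str = "http://localhost:11434") -> dict:
    """Query Ollama's /api/tags endpoint and report health plus installed models."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return {
            "reachable": True,
            "models": [m["name"] for m in data.get("models", [])],
        }
    except (urllib.error.URLError, OSError) as exc:
        # Server down or unreachable: return a status dict instead of raising,
        # so the caller can branch into the error-guidance step.
        return {"reachable": False, "error": str(exc)}
```

If the server is not running, the function returns `{"reachable": False, ...}` rather than raising, which gives a clean signal for the Skill's error-guidance step.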

Quick Start

Use the model-routing skill to verify the Ollama server status.

Dependency Matrix

Required Modules

None required

Components

references

💻 Claude Code Installation

Recommended: let Claude install it automatically. Copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: model-routing
Download link: https://github.com/CxxxxDxxxF/project-blackout/archive/main.zip#model-routing

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
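If you prefer a manual install, the steps in the prompt above can be scripted. A hedged sketch (the destination path follows the prompt; note the zip contains the whole repository, so after extraction you would move the model-routing folder into `.claude/skills/model-routing`):

```python
import io
import urllib.request
import zipfile
from pathlib import Path

# Download link from the prompt above (fragment stripped; it only names the skill).
SKILL_ZIP_URL = "https://github.com/CxxxxDxxxF/project-blackout/archive/main.zip"


def skills_dir(base=None) -> Path:
    """Resolve the .claude/skills/ install directory (project-local by default)."""
    return (base or Path.cwd()) / ".claude" / "skills"


def install_skill(url: str = SKILL_ZIP_URL, base=None) -> Path:
    """Download the repository zip and extract it under .claude/skills/."""
    dest = skills_dir(base)
    dest.mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(url, timeout=30) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))
    archive.extractall(dest)  # extracts the repo folder under skills/
    return dest
```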
