consult-llm

Community

Ask smarter LLMs with precise context.

Author: raine
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill automates consulting external LLMs: it collects relevant file context, selects the appropriate model and mode (Gemini or Codex; API, CLI, or web), and invokes the consult_llm MCP tool. It keeps prompts neutral and context focused to reduce noise and save you time.

Core Features & Use Cases

  • Gather and filter relevant files automatically to provide concise context for LLMs.
  • Decide the mode and model based on user instruction ("ask gemini" uses Gemini; "ask codex" uses Codex; "ask in browser" uses web mode).
  • Call the MCP tool consult_llm with API, CLI, or Web modes and present results.
  • Centralize logging and output formatting to simplify comparing responses and making decisions.
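The mode/model routing described above can be sketched as a small function. Note that the `route` function and its keyword matching are hypothetical illustrations of the behavior, not the Skill's actual implementation:

```python
# Hypothetical sketch of the instruction routing described above.
# The keyword matching below is an assumption, not the Skill's real logic.
def route(instruction: str) -> tuple[str, str]:
    """Map a user instruction to a (model, mode) pair."""
    text = instruction.lower()
    model = "codex" if "codex" in text else "gemini"  # default model: Gemini
    mode = "web" if "browser" in text else "api"      # default mode: API
    return model, mode

print(route("ask codex about the race condition in server.ts"))
# ('codex', 'api')
```

Saying "ask in browser about server.ts" would route to `('gemini', 'web')` under this sketch.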

Quick Start

In Claude Code, say "ask gemini about the race condition in server.ts", "ask codex about the race condition in server.ts", or "ask in browser about the race condition in server.ts". The Skill will gather relevant files, determine the mode and model, and present a summarized response with next steps.

Dependency Matrix

Required Modules

None required

Components

Standard package

💻 Claude Code Installation

Recommended: Let Claude install automatically. Simply copy and paste the text below to Claude Code.

Please help me install this Skill:
Name: consult-llm
Download link: https://github.com/raine/consult-llm-mcp/archive/main.zip#consult-llm

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.