multi-model-research
Category: Community
Coordinate multiple frontier LLMs for in-depth research
Author: krishagel
Version: 1.0.0
Installs: 0
System Documentation
What problem does it solve?
Orchestrates multiple frontier LLMs (Claude, GPT-5.1, Gemini 3.0 Pro, Perplexity Sonar, Grok 4.1) in parallel using an LLM Council pattern: each model answers the research question independently, the answers are peer-reviewed, and a chairman model synthesizes them into a single report. The result is comprehensive research produced faster and with less single-model bias.
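The fan-out step of the Council pattern amounts to a handful of concurrent API calls followed by a collection step. The sketch below shows that general shape with httpx and asyncio; the endpoint URLs, model names, payload shape, and API-key variable names are placeholders, since each real provider has its own API format and authentication.

```python
import asyncio
import os

import httpx

# Hypothetical council roster. Endpoint URLs, model names, payload shape, and
# API-key variable names are placeholders; each real provider has its own API.
COUNCIL = {
    "claude":     {"url": "https://example.com/anthropic/chat",  "model": "claude-placeholder"},
    "gpt":        {"url": "https://example.com/openai/chat",     "model": "gpt-placeholder"},
    "gemini":     {"url": "https://example.com/google/chat",     "model": "gemini-placeholder"},
    "perplexity": {"url": "https://example.com/perplexity/chat", "model": "sonar-placeholder"},
    "grok":       {"url": "https://example.com/xai/chat",        "model": "grok-placeholder"},
}


async def ask_model(client: httpx.AsyncClient, name: str, cfg: dict, question: str) -> tuple[str, str]:
    """Send the research question to one council member and return (name, answer)."""
    resp = await client.post(
        cfg["url"],
        json={"model": cfg["model"], "prompt": question},
        headers={"Authorization": f"Bearer {os.environ.get(name.upper() + '_API_KEY', '')}"},
        timeout=120,
    )
    resp.raise_for_status()
    return name, resp.json().get("answer", "")


async def fan_out(question: str) -> dict[str, str]:
    """Query every council member concurrently and collect their answers."""
    async with httpx.AsyncClient() as client:
        results = await asyncio.gather(
            *(ask_model(client, name, cfg, question) for name, cfg in COUNCIL.items())
        )
    return dict(results)


if __name__ == "__main__":
    answers = asyncio.run(fan_out("What are the latest quantum computing developments?"))
    for member, answer in answers.items():
        print(f"--- {member} ---\n{answer}\n")
```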
Core Features & Use Cases
- Parallel multi-model querying: Run multiple models in parallel for diverse perspectives and cross-model validation.
- Peer review & chairman synthesis: Structured evaluation and synthesis produce a robust final report (see the sketch after this list).
- Obsidian integration: Final reports are saved to the Geoffrey/Research folder for traceability.
- Current information grounding: Perplexity's web grounding keeps answers current and citation-rich.
- Deterministic workflow: A repeatable, deterministic pipeline from query to executive report.
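Downstream of the fan-out, the peer-review and chairman-synthesis steps might look roughly like the sketch below. The `ask` helper, the council roster, and the choice of chairman are illustrative placeholders, not the skill's actual implementation.

```python
# Placeholder council roster and chairman; in the real skill these answers
# come from the fan-out step shown earlier.
MEMBERS = ["claude", "gpt", "gemini", "perplexity", "grok"]
CHAIRMAN = "claude"


def ask(member: str, prompt: str) -> str:
    """Stand-in for a real per-model API call (see the fan-out sketch above)."""
    return f"[{member}'s response to: {prompt[:60]}...]"


def peer_review(question: str, answers: dict[str, str]) -> dict[str, str]:
    """Each member critiques the other members' anonymized answers."""
    reviews = {}
    for reviewer in answers:
        others = "\n\n".join(
            f"Answer {i + 1}:\n{text}"
            for i, (name, text) in enumerate(answers.items())
            if name != reviewer
        )
        prompt = (
            f"Research question: {question}\n\n"
            "Review the following anonymized answers for accuracy, coverage, "
            f"and missing citations:\n\n{others}"
        )
        reviews[reviewer] = ask(reviewer, prompt)
    return reviews


def synthesize(question: str, answers: dict[str, str], reviews: dict[str, str]) -> str:
    """The chairman merges every answer and review into one final report."""
    prompt = (
        f"Research question: {question}\n\n"
        f"Individual answers:\n{answers}\n\n"
        f"Peer reviews:\n{reviews}\n\n"
        "Write a single, well-cited executive research report."
    )
    return ask(CHAIRMAN, prompt)


if __name__ == "__main__":
    question = "What are the latest quantum computing developments?"
    answers = {member: ask(member, question) for member in MEMBERS}
    report = synthesize(question, answers, peer_review(question, answers))
    print(report)
```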
Quick Start
Trigger a search with a question like "What are the latest quantum computing developments?" and review the generated Markdown report with Obsidian links.
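For the Obsidian side, a minimal sketch of how a finished report could be written into the vault with python-frontmatter is shown below. The vault path, folder layout, and frontmatter field names are assumptions based on the description above.

```python
from datetime import date
from pathlib import Path

import frontmatter

# Assumed vault location and folder layout; adjust to your own Obsidian setup.
RESEARCH_DIR = Path.home() / "Obsidian" / "Geoffrey" / "Research"


def save_report(question: str, body: str, models: list[str]) -> Path:
    """Write the synthesized report as a Markdown note with YAML frontmatter."""
    post = frontmatter.Post(
        body,
        title=question,
        date=date.today().isoformat(),
        models=models,
        type="research-report",  # frontmatter field names are assumptions
    )
    RESEARCH_DIR.mkdir(parents=True, exist_ok=True)
    safe = "".join(c for c in question.lower() if c.isalnum() or c.isspace())
    slug = "-".join(safe.split())[:60]
    out_path = RESEARCH_DIR / f"{slug}.md"
    out_path.write_text(frontmatter.dumps(post), encoding="utf-8")
    return out_path


if __name__ == "__main__":
    saved = save_report(
        "What are the latest quantum computing developments?",
        "## Executive Summary\n\n...",
        ["claude", "gpt", "gemini", "perplexity", "grok"],
    )
    print(f"Report saved to {saved}")
```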
Dependency Matrix
Required Modules
httpx, pyyaml, python-dotenv, python-frontmatter
Components
scripts
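At startup the required modules fit together roughly like this: python-dotenv loads the provider API keys from a local .env file before any httpx calls are made. The environment-variable names below are assumptions, not documented names from this skill.

```python
import os

from dotenv import load_dotenv

load_dotenv()  # reads a .env file in the working directory, if present

# Assumed environment-variable names for the five providers.
REQUIRED_KEYS = [
    "ANTHROPIC_API_KEY",
    "OPENAI_API_KEY",
    "GEMINI_API_KEY",
    "PERPLEXITY_API_KEY",
    "XAI_API_KEY",
]

missing = [key for key in REQUIRED_KEYS if not os.environ.get(key)]
if missing:
    raise SystemExit(f"Missing API keys in .env: {', '.join(missing)}")
print("All provider keys loaded.")
```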
💻 Claude Code Installation
Recommended: Let Claude install automatically. Simply copy and paste the text below into Claude Code.
Please help me install this Skill:
Name: multi-model-research
Download link: https://github.com/krishagel/geoffrey/archive/main.zip#multi-model-research
Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
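If you would rather install manually, the sketch below approximates what the automatic step does: download the archive, extract it, and copy the skill folder into .claude/skills/. The extracted layout, the skill's location inside the repository, and the project-level destination directory are assumptions.

```python
import io
import shutil
import urllib.request
import zipfile
from pathlib import Path

ARCHIVE_URL = "https://github.com/krishagel/geoffrey/archive/main.zip"
SKILL_NAME = "multi-model-research"
# Project-level skills directory; use Path.home() / ".claude" / "skills"
# instead for a user-level install (which location applies is an assumption).
DEST = Path(".claude") / "skills" / SKILL_NAME

with urllib.request.urlopen(ARCHIVE_URL) as resp:
    archive = zipfile.ZipFile(io.BytesIO(resp.read()))

tmp = Path("geoffrey-archive")
archive.extractall(tmp)

# Locate the skill folder inside the extracted archive; its exact path within
# the repository is assumed, not documented here.
skill_dir = next((p for p in tmp.rglob(SKILL_NAME) if p.is_dir()), None)
if skill_dir is None:
    raise SystemExit(f"Could not find a '{SKILL_NAME}' folder in the archive.")

DEST.parent.mkdir(parents=True, exist_ok=True)
shutil.copytree(skill_dir, DEST, dirs_exist_ok=True)
print(f"Installed to {DEST.resolve()}")
```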