cursor-explorer-mcp
Faster multi-file code exploration.
System Documentation
What problem does it solve?
This Skill helps developers perform token-efficient, multi-file codebase analysis by delegating work to a dedicated cursor-agent MCP server. It enables broad searches, architecture mapping, and flow tracing across large codebases without manual, file-by-file reading. Use it when a single-file answer would be expensive, or when you need cross-file insights rather than context-limited explanations.
Core Features & Use Cases
- Batch multi-file queries: Submit batched questions to the MCP server and receive consolidated file:line references, code snippets, and purposes.
- Background, asynchronous analysis: Leverages a cursor-agent server to keep your session responsive while long analyses run.
- Cross-file tracing & architecture mapping: Understand how components interact across files and modules, reducing exploration time.
- Use Case: When you need to understand how a feature is implemented across multiple files or when tracing a data flow that spans modules, this Skill provides concrete locations and context.
Quick Start
Start a batched query to locate where a symbol is used across the codebase:

start = mcp__cursor_agent__cursor_agent_start({
  "query": "Find where X is used. For each location: file:line, code snippet, and purpose."
})

Wait for completion with mcp__cursor_agent__cursor_agent_result, then present the findings.
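The start-then-poll flow above can be sketched as follows. Note that `cursor_agent_start` and `cursor_agent_result` here are hypothetical Python stand-ins for the MCP tools `mcp__cursor_agent__cursor_agent_start` and `mcp__cursor_agent__cursor_agent_result`; a real agent session invokes those tools directly, and the stubbed return values are illustrative only.

```python
import time

# Hypothetical stand-in for mcp__cursor_agent__cursor_agent_start:
# kicks off a background analysis and returns a session id.
def cursor_agent_start(request: dict) -> str:
    return "session-1"

# Hypothetical stand-in for mcp__cursor_agent__cursor_agent_result:
# polls for the finished analysis. The payload below is an example of
# the consolidated file:line / snippet / purpose shape described above.
def cursor_agent_result(session_id: str) -> dict:
    return {
        "status": "done",
        "findings": [
            {"file": "src/auth.py", "line": 42,
             "snippet": "validate_token(x)", "purpose": "entry point for X"},
        ],
    }

# Start one batched query covering every location of interest.
session = cursor_agent_start({
    "query": ("Find where X is used. For each location: "
              "file:line, code snippet, and purpose.")
})

# Poll until the background analysis completes, then present the findings.
result = cursor_agent_result(session)
while result["status"] != "done":
    time.sleep(1)
    result = cursor_agent_result(session)

for finding in result["findings"]:
    print(f'{finding["file"]}:{finding["line"]} - {finding["purpose"]}')
```

Because the work runs in the background on the server, your session stays responsive during the polling loop.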
Dependency Matrix
Required Modules
None required
Components
Standard package
💻 Claude Code Installation
Recommended: let Claude install automatically. Copy and paste the text below into Claude Code.
Please help me install this Skill: Name: cursor-explorer-mcp Download link: https://github.com/sepiabrown/.claude/archive/main.zip#cursor-explorer-mcp Please download this .zip file, extract it, and install it in the .claude/skills/ directory.