ir-analysis

Official

Measure retrieval quality

Author: sourcegraph
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill quantifies how effectively AI agents retrieve relevant files from a codebase, comparing different retrieval methods against a known ground truth.

Core Features & Use Cases

  • Information Retrieval (IR) Metrics: Computes Precision, Recall, MRR, nDCG, and MAP for file access (a computation sketch follows this list).
  • Config Comparison: Directly compares retrieval performance between baseline (local tools) and advanced (e.g., Sourcegraph MCP) configurations.
  • Use Case: Evaluate whether an AI coding assistant is more likely to find the correct files needed for a task when using a semantic search tool versus relying solely on traditional file system commands.
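The sketch below illustrates how these per-task metrics can be computed from an agent's ordered file accesses and a ground-truth set of relevant files. The function name, data shapes, and cutoff parameter are illustrative assumptions, not the skill's actual implementation.

```python
import math

def ir_metrics(retrieved, relevant, k=None):
    """Compute Precision, Recall, MRR, nDCG, and AP for one task.

    retrieved: list of file paths in the order the agent accessed them.
    relevant:  set of ground-truth file paths for the task.
    k:         optional cutoff; defaults to the full retrieved list.
    """
    if k is not None:
        retrieved = retrieved[:k]
    hits = [1 if path in relevant else 0 for path in retrieved]

    precision = sum(hits) / len(retrieved) if retrieved else 0.0
    recall = sum(hits) / len(relevant) if relevant else 0.0

    # MRR: reciprocal rank of the first relevant file (0 if none was found).
    mrr = 0.0
    for rank, hit in enumerate(hits, start=1):
        if hit:
            mrr = 1.0 / rank
            break

    # nDCG: discounted gain of the hits, normalized by the ideal ordering.
    dcg = sum(hit / math.log2(rank + 1) for rank, hit in enumerate(hits, start=1))
    ideal_hits = min(len(relevant), len(retrieved))
    idcg = sum(1.0 / math.log2(rank + 1) for rank in range(1, ideal_hits + 1))
    ndcg = dcg / idcg if idcg > 0 else 0.0

    # AP: mean precision at each rank where a relevant file appears;
    # averaging AP across tasks yields MAP.
    precisions_at_hits = [
        sum(hits[:rank]) / rank
        for rank, hit in enumerate(hits, start=1) if hit
    ]
    ap = sum(precisions_at_hits) / len(relevant) if relevant else 0.0

    return {"precision": precision, "recall": recall,
            "mrr": mrr, "ndcg": ndcg, "ap": ap}

# Example: two of the three accessed files are in the ground-truth set.
print(ir_metrics(
    retrieved=["src/api.py", "README.md", "src/models.py"],
    relevant={"src/api.py", "src/models.py"},
))
```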

Quick Start

Run the ir-analysis skill to compute retrieval metrics for all benchmarks.
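As a rough illustration of what the output of such a run looks like, the sketch below aggregates hypothetical per-task metrics for two configurations (baseline local tools vs. a Sourcegraph MCP setup). The numbers and field names are placeholders, not results produced by the skill.

```python
from statistics import mean

def aggregate(per_task_metrics):
    """Average each metric over all benchmark tasks (MAP is the mean of per-task AP)."""
    keys = per_task_metrics[0].keys()
    return {key: mean(task[key] for task in per_task_metrics) for key in keys}

# Hypothetical per-task results for two configurations (illustrative only).
baseline_tasks = [
    {"precision": 0.40, "recall": 0.50, "mrr": 0.50, "ndcg": 0.55, "ap": 0.45},
    {"precision": 0.25, "recall": 0.33, "mrr": 0.33, "ndcg": 0.40, "ap": 0.30},
]
mcp_tasks = [
    {"precision": 0.60, "recall": 0.75, "mrr": 1.00, "ndcg": 0.80, "ap": 0.70},
    {"precision": 0.50, "recall": 0.66, "mrr": 0.50, "ndcg": 0.65, "ap": 0.55},
]

baseline, mcp = aggregate(baseline_tasks), aggregate(mcp_tasks)
for metric in baseline:
    print(f"{metric:>9}: baseline={baseline[metric]:.2f}  mcp={mcp[metric]:.2f}")
```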

Dependency Matrix

Required Modules

None required

Components

  • scripts
  • references

💻 Claude Code Installation

Recommended: Let Claude install automatically. Simply copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: ir-analysis
Download link: https://github.com/sourcegraph/CodeScaleBench/archive/main.zip#ir-analysis

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
