dspy-miprov2-optimizer

Community

Bayesian optimization of DSPy instructions and demos

Author: OmidZamani
Version: 1.0.0
Installs: 1

System Documentation

What problem does it solve?

This Skill runs MIPROv2's Bayesian optimization to jointly tune DSPy instructions and few-shot demonstrations, delivering higher-scoring programs with fewer manual trials.

Core Features & Use Cases

  • Phase-driven workflow (Bootstrap, Propose, Search): generate candidate demos, ground instruction proposals, and explore the joint search space.
  • Supports large training sets (200+ examples) and configurable trial counts.
  • Includes a production-grade example demonstrating performance gains from MIPROv2.
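The three phases above can be sketched, purely illustratively, in plain Python with no DSPy dependency. The candidate lists, the toy `score` function, and the random search below are simplified stand-ins (assumptions, not the library's implementation) for MIPROv2's bootstrapped demo sets, LM-grounded instruction proposals, and Bayesian search:

```python
import random

# Hypothetical toy task: (question, answer) pairs.
trainset = [("2+2?", "4"), ("3+3?", "6"), ("5+1?", "6"), ("4+4?", "8")]

def bootstrap_demos(trainset, k=2, n=3):
    """Phase 1 (Bootstrap): sample n candidate few-shot demo sets."""
    return [random.sample(trainset, k) for _ in range(n)]

def propose_instructions():
    """Phase 2 (Propose): candidate instructions. MIPROv2 grounds these
    in the program, the data, and the bootstrapped demos via an LM;
    here they are hard-coded for illustration."""
    return ["Answer the arithmetic question.",
            "Reply with only the final number."]

def score(instruction, demos):
    """Toy metric: prefer more demos and shorter instructions.
    A real metric would run the program on a validation set."""
    return len(demos) - len(instruction) / 100

def search(trainset, num_trials=10, seed=0):
    """Phase 3 (Search): evaluate instruction/demo combinations and keep
    the best. MIPROv2 uses Bayesian optimization over trials rather than
    this random sampling."""
    random.seed(seed)
    demo_sets = bootstrap_demos(trainset)
    instructions = propose_instructions()
    return max(
        ((random.choice(instructions), random.choice(demo_sets))
         for _ in range(num_trials)),
        key=lambda cand: score(*cand),
    )

best_instruction, best_demos = search(trainset)
print(best_instruction, len(best_demos))
```

The point of the sketch is the shape of the loop: candidates are generated once, then the search phase spends its trial budget picking the best joint configuration.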

Quick Start

  1. Set up the environment:
     dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))
  2. Define a RAG-based DSPy agent and run the optimizer:
     optimizer = dspy.MIPROv2(..., auto="medium", num_threads=24)
     compiled = optimizer.compile(RAGAgent(), trainset=trainset)
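The elided arguments to dspy.MIPROv2 include a metric, which DSPy optimizers call with the signature (example, prediction, trace). A minimal exact-match metric might look like the sketch below; the answer field name and the stub objects are assumptions for illustration, not part of this Skill:

```python
from types import SimpleNamespace

def exact_match(example, prediction, trace=None):
    """Return True when the predicted answer matches the gold answer,
    ignoring case and surrounding whitespace. The optimizer maximizes
    the mean of this metric over the training set."""
    return example.answer.strip().lower() == prediction.answer.strip().lower()

# Stub objects standing in for dspy.Example / dspy.Prediction:
gold = SimpleNamespace(answer="Paris")
pred = SimpleNamespace(answer="paris ")
print(exact_match(gold, pred))  # → True
```

In real DSPy code this function would be passed as the metric argument, e.g. dspy.MIPROv2(metric=exact_match, auto="medium", num_threads=24).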

Dependency Matrix

Required Modules

None required

Components

Standard package

💻 Claude Code Installation

Recommended: let Claude install it automatically. Copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: dspy-miprov2-optimizer
Download link: https://github.com/OmidZamani/dspy-skills/archive/main.zip#dspy-miprov2-optimizer

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.