arize-prompt-optimization

Official

Optimize LLM prompts with data.

Author: Arize-ai
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill helps you systematically improve the performance of your Large Language Model (LLM) prompts by leveraging production trace data, evaluations, and annotations, so that prompt revisions are grounded in observed behavior rather than guesswork.

Core Features & Use Cases

  • Prompt Extraction: Automatically identify and extract prompts from various span types within your trace data.
  • Performance Analysis: Gather and analyze performance signals like evaluation scores, human annotations, and error rates associated with specific prompts.
  • Data-Driven Optimization: Utilize a structured meta-prompting approach to generate improved prompt versions based on identified failure patterns.
  • Use Case: Debug a chatbot prompt that is generating factually incorrect responses by analyzing traces of user interactions, identifying common errors, and using the skill's meta-prompt to refine the system instructions for better accuracy.
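The extraction-and-analysis flow described above can be sketched roughly as follows. Everything here is illustrative: the span export format, column names (`prompt_template`, `eval_score`, `is_error`), and the `summarize_prompts` helper are assumptions for the example, not the skill's actual API.

```python
# Hypothetical sketch: aggregate performance signals per prompt
# from exported trace spans. Column names are illustrative assumptions.
import pandas as pd

def summarize_prompts(spans: pd.DataFrame) -> pd.DataFrame:
    """Group LLM spans by prompt template and compute performance signals."""
    return (
        spans.groupby("prompt_template")
        .agg(
            calls=("span_id", "count"),
            avg_eval_score=("eval_score", "mean"),
            error_rate=("is_error", "mean"),
        )
        .sort_values("avg_eval_score")  # worst-performing prompts first
    )

# Toy span export standing in for real production trace data.
spans = pd.DataFrame({
    "span_id": ["a", "b", "c", "d"],
    "prompt_template": ["v1", "v1", "v2", "v2"],
    "eval_score": [0.2, 0.4, 0.9, 0.7],
    "is_error": [True, False, False, False],
})

summary = summarize_prompts(spans)
print(summary)
```

Sorting by average evaluation score surfaces the weakest prompt version first, which is the natural input to the optimization step that follows.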

Quick Start

Use the arize-prompt-optimization skill to extract the current prompt from the latest LLM span in your project.
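Once a prompt and its failure cases are extracted, the meta-prompting step folds them into a single instruction asking an LLM for an improved version. The template wording and `build_meta_prompt` helper below are illustrative assumptions, not the skill's bundled meta-prompt:

```python
# Hypothetical meta-prompting step: combine the current prompt with
# observed failure examples into a request for an improved prompt.
def build_meta_prompt(current_prompt: str, failures: list[dict]) -> str:
    """Render failures into a meta-prompt asking for a revised prompt."""
    failure_lines = "\n".join(
        f"- input: {f['input']!r} -> output: {f['output']!r} ({f['issue']})"
        for f in failures
    )
    return (
        "You are a prompt engineer. Improve the prompt below.\n\n"
        f"CURRENT PROMPT:\n{current_prompt}\n\n"
        f"OBSERVED FAILURES:\n{failure_lines}\n\n"
        "Rewrite the prompt to address these failures while preserving "
        "its original intent and style."
    )

meta = build_meta_prompt(
    "Answer the user's question concisely.",
    [{"input": "Capital of Australia?", "output": "Sydney",
      "issue": "factual error"}],
)
print(meta)
```

The resulting string is what gets sent to an LLM to generate a candidate prompt revision, which can then be re-evaluated against the same traces.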

Dependency Matrix

Required Modules

None required

Components

references

💻 Claude Code Installation

Recommended: let Claude install it automatically. Copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: arize-prompt-optimization
Download link: https://github.com/Arize-ai/arize-skills/archive/main.zip#arize-prompt-optimization

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
