langfuse-observability
LLM observability and tracing.
Category: Community
Author: aleonsa
Version: 1.0.0
Installs: 0
System Documentation
What problem does it solve?
This Skill addresses the challenge of understanding and debugging complex LLM applications by providing robust observability, tracing, and evaluation capabilities.
Core Features & Use Cases
- End-to-end Tracing: Track LLM calls, agentic workflows, and spans across your application.
- Prompt Management: Version, manage, and retrieve prompts directly from Langfuse.
- Cost & Latency Monitoring: Monitor token usage, costs, and latency per user or tenant.
- Evaluation: Build and run evaluation datasets to score LLM performance.
- Use Case: Integrate Langfuse into your FastAPI application to automatically trace all LLM interactions, tag them with user and tenant IDs, and visualize performance metrics in the Langfuse UI.
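To make the cost-and-latency monitoring idea above concrete, here is a minimal, illustrative sketch (plain Python, not the Langfuse API) of how per-user and per-tenant cost roll-ups like those shown in the Langfuse UI can be modeled. The record fields and function names are assumptions for the example.

```python
from dataclasses import dataclass
from collections import defaultdict

# Hypothetical trace record: Langfuse traces carry similar metadata
# (user/tenant tags, token counts, cost), but field names here are
# assumptions for illustration.
@dataclass
class TraceRecord:
    user_id: str
    tenant_id: str
    input_tokens: int
    output_tokens: int
    cost_usd: float

def cost_by_tenant(traces):
    """Aggregate total cost per tenant across a list of trace records."""
    totals = defaultdict(float)
    for t in traces:
        totals[t.tenant_id] += t.cost_usd
    return dict(totals)

traces = [
    TraceRecord("u1", "acme", 120, 40, 0.002),
    TraceRecord("u2", "acme", 300, 90, 0.005),
    TraceRecord("u3", "globex", 50, 10, 0.001),
]
print(cost_by_tenant(traces))  # prints the per-tenant cost totals
```

In a real integration the tagging happens at trace time (e.g. attaching user_id and tenant_id as trace metadata), and Langfuse performs this aggregation server-side; the sketch only shows the shape of the data involved.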
Quick Start
Integrate Langfuse into your Python application by following the setup instructions in the documentation.
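As a starting point, a hedged sketch of the typical configuration step: the Langfuse Python SDK reads its credentials from environment variables (names per the Langfuse documentation; verify against your SDK version). The key values below are placeholders, and the decorator usage is shown in comments so the sketch runs without the langfuse package installed.

```python
import os

# Credentials the Langfuse SDK expects in the environment
# (placeholder values; substitute your project's real keys).
os.environ.setdefault("LANGFUSE_PUBLIC_KEY", "pk-lf-placeholder")
os.environ.setdefault("LANGFUSE_SECRET_KEY", "sk-lf-placeholder")
os.environ.setdefault("LANGFUSE_HOST", "https://cloud.langfuse.com")

# With the SDK installed, a decorator-based integration typically looks like:
#
#   from langfuse.decorators import observe
#
#   @observe()
#   def handle_llm_request(prompt: str) -> str:
#       ...  # the call is traced automatically
#
# (kept as comments here so this sketch is runnable without langfuse)

print("Langfuse host:", os.environ["LANGFUSE_HOST"])
```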
Dependency Matrix
Required Modules
None required
Components
references
💻 Claude Code Installation
Recommended: Let Claude install automatically. Simply copy and paste the text below into Claude Code.
Please help me install this Skill:
Name: langfuse-observability
Download link: https://github.com/aleonsa/claude-config/archive/main.zip#langfuse-observability
Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
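For a manual install, the equivalent steps can be sketched as below. The download step is shown commented because it needs network access; the directory layout (skills under .claude/skills/) is taken from the instructions above.

```shell
# Hypothetical manual install sketch. Download and extract the archive:
#   curl -L -o claude-config.zip "https://github.com/aleonsa/claude-config/archive/main.zip"
#   unzip claude-config.zip
# then copy the skill folder into place:
#   cp -r claude-config-main/langfuse-observability "$HOME/.claude/skills/"

# Target layout: each skill lives in its own directory under .claude/skills/
mkdir -p "$HOME/.claude/skills/langfuse-observability"
ls "$HOME/.claude/skills"
```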