llm-streaming-response-handler

Official

Real-time LLM UI streaming

Author: curiositech
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill addresses the challenge of building responsive and engaging user interfaces for Large Language Model (LLM) interactions by enabling real-time streaming of responses.

Core Features & Use Cases

  • Real-time Token Display: Shows LLM output as it's generated, creating a dynamic "typing" effect.
  • Server-Sent Events (SSE): Leverages SSE for efficient, one-way streaming from server to client.
  • Cancellation & Error Recovery: Implements mechanisms to stop generation mid-stream and handle network or API errors gracefully.
  • Use Case: Building a chatbot interface where the assistant's reply appears token by token as it is generated, or a code generation tool that displays snippets as the LLM produces them.
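The subtle part of SSE handling is buffering: a network chunk can end mid-line, so the partial tail must be carried over to the next read. A minimal sketch of that parsing step (the `data:` framing follows the standard SSE wire format; the function name is illustrative, not this skill's actual code):

```typescript
// Minimal SSE data-line parser (illustrative sketch, not the skill's code).
// Takes buffered text and returns completed `data:` payloads plus the
// partial tail line to prepend to the next network chunk.
function parseSSELines(buffer: string): { events: string[]; rest: string } {
  const lines = buffer.split("\n");
  const rest = lines.pop() ?? ""; // possibly incomplete last line
  const events: string[] = [];
  for (const line of lines) {
    if (line.startsWith("data: ")) {
      events.push(line.slice("data: ".length));
    }
  }
  return { events, rest };
}
```

Feeding each chunk through this and prepending `rest` to the next chunk is what makes the "typing" effect render cleanly even when tokens are split across packets.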

Quick Start

Use the llm-streaming-response-handler skill to implement a real-time token display for LLM responses using Server-Sent Events.
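As a sketch of what such an implementation might look like, the loop below reads a streamed `fetch` response body, decodes SSE `data:` lines into tokens, and supports mid-stream cancellation via `AbortController`. The endpoint URL, payload shape, and `[DONE]` sentinel are assumptions modeled on common LLM streaming APIs, not this skill's actual interface:

```typescript
// Sketch only: real-time token streaming over SSE with cancellation.
// Reads a streamed Response body chunk by chunk, emitting each SSE
// `data:` payload to the onToken callback as it arrives.
async function readSSE(res: Response, onToken: (t: string) => void): Promise<void> {
  if (!res.body) throw new Error("response has no body");
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep partial line for the next chunk
    for (const line of lines) {
      // "[DONE]" is a common end-of-stream sentinel (an assumption here).
      if (line.startsWith("data: ") && line !== "data: [DONE]") {
        onToken(line.slice("data: ".length));
      }
    }
  }
}

// Usage with a hypothetical endpoint; calling ctrl.abort() stops the
// request and the stream mid-generation.
async function streamCompletion(
  prompt: string,
  onToken: (t: string) => void,
  signal?: AbortSignal,
): Promise<void> {
  const res = await fetch("/api/llm/stream", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
    signal, // aborting the controller cancels mid-stream
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  await readSSE(res, onToken);
}
```

For error recovery, a caller would typically wrap `streamCompletion` in a try/catch, distinguish `AbortError` (user cancellation) from network or HTTP failures, and keep whatever tokens were already rendered.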

Dependency Matrix

Required Modules

None required

Components

  • scripts
  • references

💻 Claude Code Installation

Recommended: Let Claude install it automatically. Copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: llm-streaming-response-handler
Download link: https://github.com/curiositech/some_claude_skills/archive/main.zip#llm-streaming-response-handler

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
View Source Repository

Agent Skills Search Helper

Install a tiny helper in your Agent to search and equip skills on demand from a library of 223,000+ vetted skills.