streaming-llm-responses

Community

Enable real-time streaming feedback in AI chats.

Author: AbdullahMalik17
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill enables real-time streaming feedback in AI chat interfaces, allowing the UI to react to token streams and lifecycle events without blocking the user.

Core Features & Use Cases

  • Real-time streaming: manage onResponseStart, onResponseEnd, and incremental token delivery to keep the UI responsive (see the sketch after this list).
  • Lifecycle events and effects: support onEffect, ProgressUpdateEvent, and client-tool patterns for dynamic UI updates.
  • Thread state synchronization: coordinate thread changes and UI locking/unlocking to ensure smooth interactions across components.
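The event names onResponseStart, onResponseEnd, onEffect, and ProgressUpdateEvent above suggest a handler-based API. The TypeScript sketch below is illustrative only: the handler interface, the event shape, and the consumeStream function are assumptions for this example, not the Skill's actual API.

```typescript
// Illustrative sketch only -- handler names follow the feature list above,
// but the interface and event shape are assumptions, not the Skill's real API.
interface StreamHandlers {
  onResponseStart: () => void;                      // fired before the first token
  onToken: (token: string) => void;                 // incremental token delivery
  onEffect: (effect: { type: string; payload?: unknown }) => void; // e.g. a ProgressUpdateEvent
  onResponseEnd: () => void;                        // fired once the stream closes
}

type StreamEvent =
  | { kind: "start" }
  | { kind: "token"; data: string }
  | { kind: "effect"; data: { type: string; payload?: unknown } }
  | { kind: "end" };

// Drive the handlers from an async iterable of stream events without blocking the UI.
async function consumeStream(
  events: AsyncIterable<StreamEvent>,
  handlers: StreamHandlers,
): Promise<void> {
  for await (const event of events) {
    switch (event.kind) {
      case "start":
        handlers.onResponseStart();
        break;
      case "token":
        handlers.onToken(event.data);
        break;
      case "effect":
        handlers.onEffect(event.data);              // ProgressUpdateEvent, client-tool calls, etc.
        break;
      case "end":
        handlers.onResponseEnd();
        break;
    }
  }
}
```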

Quick Start

Use streaming-llm-responses to wire up a chat UI with real-time feedback, including response lifecycle handlers and client effects.
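A hypothetical wiring example follows, reusing the StreamHandlers shape from the sketch above. The element IDs, the setThreadLocked helper, and the event source are assumptions made for illustration; the Skill's own thread-locking and effect-handling conventions may differ.

```typescript
// Assumed DOM wiring: lock the input while a response streams, append tokens,
// surface progress effects, and unlock the thread when the response ends.
const input = document.querySelector<HTMLTextAreaElement>("#chat-input")!;
const output = document.querySelector<HTMLDivElement>("#chat-output")!;

// Stand-in for thread state synchronization / UI locking.
function setThreadLocked(locked: boolean): void {
  input.disabled = locked;
}

const handlers: StreamHandlers = {
  onResponseStart: () => setThreadLocked(true),
  onToken: (token) => output.append(token),
  onEffect: (effect) => {
    if (effect.type === "ProgressUpdateEvent") {
      console.log("progress:", effect.payload);     // render a progress indicator as needed
    }
  },
  onResponseEnd: () => setThreadLocked(false),
};

// consumeStream(eventsFromYourLlmClient, handlers);  // hypothetical event source
```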

Dependency Matrix

Required Modules

None required

Components

references

💻 Claude Code Installation

Recommended: Let Claude install automatically. Simply copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: streaming-llm-responses
Download link: https://github.com/AbdullahMalik17/My_skills/archive/main.zip#streaming-llm-responses

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
