llm-streaming-response-handler - Skill Dossier

Build production LLM streaming UIs with Server-Sent Events (SSE), real-time token display, cancellation, and error recovery. Supports the OpenAI and Anthropic (Claude) streaming APIs. Use for chatbots, AI assistants, and real-time text generation. Activate on "LLM streaming", "SSE", "token stream", "chat UI", "real-time AI". NOT for batch processing, non-streaming APIs, or bidirectional WebSocket chat.
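The SSE handling this skill describes can be sketched in TypeScript. The parser below is a minimal illustration, not any vendor's official client: network chunks may split an SSE event at arbitrary boundaries, so it buffers input until the blank line that terminates each event, then extracts the `data:` payloads. The class name `SseParser` is hypothetical.

```typescript
// Minimal SSE chunk parser for LLM token streams (a sketch, assuming the
// standard text/event-stream framing: events separated by a blank line).
class SseParser {
  private buffer = "";

  // Feed one raw network chunk; returns the `data:` payloads of any
  // events completed by this chunk. Partial events stay buffered.
  push(chunk: string): string[] {
    this.buffer += chunk;
    const payloads: string[] = [];
    let sep: number;
    // An SSE event ends with a blank line ("\n\n").
    while ((sep = this.buffer.indexOf("\n\n")) !== -1) {
      const rawEvent = this.buffer.slice(0, sep);
      this.buffer = this.buffer.slice(sep + 2);
      for (const line of rawEvent.split("\n")) {
        if (line.startsWith("data:")) {
          payloads.push(line.slice(5).trim());
        }
      }
    }
    return payloads;
  }
}
```

In a real UI you would pipe `fetch` `ReadableStream` chunks through a parser like this, stop when the stream signals completion (OpenAI's convention is a `data: [DONE]` sentinel), and pass an `AbortController` signal to `fetch` to implement cancellation.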

Uncategorized

Allowed Tools

Read, Write, Edit, Bash(npm:*)


Coming in Spring 2026 Beta

WinDAGs will match this skill automatically. Then ask, for example:

"Use llm-streaming-response-handler to help me build..."
Request Early Access
"Use llm-streaming-response-handler to help me build a feature system"
"I need expert help with build production llm streaming uis with server-sen..."