chatkit-python

Community

Build fast chat APIs for OpenAI ChatKit.

Author: NaimalArain13
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

The ChatKit Python backend skill provides a FastAPI-based backend that powers the OpenAI ChatKit frontend, enabling a scalable chat API with SSE streaming, conversation persistence, and MCP integration.

Core Features & Use Cases

  • SSE streaming chat responses: Real-time, chunked responses delivered to chat clients with minimal latency.
  • Conversation persistence: Store and retrieve threads and messages to maintain context across sessions.
  • MCP and Gemini integration: Use MCP tools and Gemini via LiteLLM for enhanced agent-assisted workflows, including task management.
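Conversation persistence can be as simple as a keyed map from thread IDs to ordered message lists. The sketch below is illustrative only (the skill's actual storage layer is not documented here); the `ThreadStore` and `Message` names are assumptions:

```python
# Minimal in-memory sketch of conversation persistence (hypothetical
# types; a real deployment would back this with a database).
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Message:
    role: str      # "user" or "assistant"
    content: str


@dataclass
class ThreadStore:
    threads: Dict[str, List[Message]] = field(default_factory=dict)

    def add_message(self, thread_id: str, role: str, content: str) -> None:
        # Create the thread on first write, then append in order.
        self.threads.setdefault(thread_id, []).append(Message(role, content))

    def get_messages(self, thread_id: str) -> List[Message]:
        # Unknown threads return an empty history so callers can start fresh.
        return self.threads.get(thread_id, [])
```

Swapping the dict for a database table keeps the same interface while making context survive server restarts.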

Quick Start

  • Create and activate a Python virtual environment.
  • Install dependencies: pip install fastapi uvicorn sse-starlette "openai-agents[litellm]"
  • Run the server: uvicorn main:app --reload

Dependency Matrix

Required Modules

None required

Components

Standard package

💻 Claude Code Installation

Recommended: Let Claude install automatically. Simply copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: chatkit-python
Download link: https://github.com/NaimalArain13/Hackathon-II_The-Evolution-of-Todo/archive/main.zip#chatkit-python

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
