chat

Community

Build AI chat apps with ease, any LLM, any time.

Author: juanre
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

Starting a new LLM project or building a basic chat application usually means writing boilerplate for each provider and managing conversation history by hand. This Skill provides a unified, simple interface for core chat completions, letting you focus on your application logic.

Core Features & Use Cases

  • Unified Chat API: Send messages to OpenAI, Anthropic, Google, or Ollama using a single, consistent LLMRequest and LLMResponse structure.
  • Semantic Aliases: Use task-based names (e.g., "summarizer", "chatbot") defined in a lockfile, so you can switch models without changing code as your needs evolve.
  • Multi-Turn Conversations: Maintain conversation history effortlessly by appending Message objects, enabling natural, ongoing dialogues with your AI.
  • Use Case: Quickly build a simple AI assistant that can answer questions, summarize text, or hold multi-turn conversations, with the flexibility to switch providers for cost or performance (see the sketch after this list).
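
To make the pieces concrete, here is a minimal multi-turn sketch. LLMRequest, LLMResponse, and Message are the structures named above, but the import paths, the LLMRing entry point, and the chat() method are assumptions about llmring's API rather than confirmed usage:

```python
# Hypothetical sketch of a multi-turn conversation.
# LLMRequest/Message come from the feature list above; the import paths,
# the LLMRing client, and ring.chat() are assumptions, not confirmed API.
from llmring import LLMRing                      # assumed client entry point
from llmring.schemas import LLMRequest, Message  # assumed module path


async def ask(ring: LLMRing, history: list[Message], text: str) -> str:
    # Append the user turn, send the whole history, then record the reply
    # so the next call sees the full conversation.
    history.append(Message(role="user", content=text))
    request = LLMRequest(model="chatbot", messages=history)  # "chatbot" is a lockfile alias
    response = await ring.chat(request)          # assumed method name
    history.append(Message(role="assistant", content=response.content))
    return response.content
```

Because the request names the "chatbot" alias rather than a concrete model, rebinding that alias in the lockfile switches providers without touching this code.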

Quick Start

First, initialize your lockfile with 'llmring lock init' and bind 'summarizer' to a model. Then, use the chat skill to send "Hello!" to your 'summarizer' alias.
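
A minimal sketch of that flow, assuming the lockfile already binds 'summarizer' to a model (the exact binding command is not shown here) and using the same hypothetical Python entry point as above:

```python
# Quick-start sketch: send "Hello!" to the "summarizer" alias.
# Assumes `llmring lock init` has been run and "summarizer" is bound in the
# lockfile; import paths and method/field names are assumptions.
import asyncio

from llmring import LLMRing
from llmring.schemas import LLMRequest, Message


async def main() -> None:
    async with LLMRing() as ring:                # assumed async context manager
        request = LLMRequest(
            model="summarizer",                  # semantic alias from the lockfile
            messages=[Message(role="user", content="Hello!")],
        )
        response = await ring.chat(request)      # assumed method name
        print(response.content)                  # assumed response field


asyncio.run(main())
```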

Dependency Matrix

Required Modules

None required

Components

Standard package

💻 Claude Code Installation

Recommended: let Claude install it automatically. Copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: chat
Download link: https://github.com/juanre/llmring/archive/main.zip#chat

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.