libllm
Official: Connect to OpenAI-compatible LLMs via HTTP.
Author: copilot-ld
Version: 1.0.0
Installs: 0
System Documentation
What problem does it solve?
Provides a dedicated LLM API client that communicates directly with OpenAI-compatible endpoints to perform chat completions and generate embeddings, simplifying integration in AI-powered apps.
Core Features & Use Cases
- Chat completions and text embeddings via HTTP with support for GitHub Models, Azure OpenAI, and standard OpenAI endpoints.
- Streaming responses, token counting, and parallel tool-call handling to support complex agent workflows.
- Use cases include building AI assistants, knowledge-base querying, and vector-based retrieval systems in production services.
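The streaming support mentioned above follows the usual OpenAI-compatible convention: the server sends `data: {json}` server-sent-event frames, terminated by `data: [DONE]`. The helper below is only an illustrative sketch of how such frames are decoded into text deltas; it is not code from libllm itself, and the function name is made up for this example.

```javascript
// Sketch: decode OpenAI-style streaming frames ("data: {json}" lines,
// terminated by "data: [DONE]") into the text deltas they carry.
// Hypothetical helper for illustration; not part of the libllm API.
function parseSseChunk(text) {
  const deltas = [];
  for (const line of text.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (delta) deltas.push(delta);
  }
  return deltas.join("");
}
```

Feeding it two delta frames followed by the `[DONE]` sentinel reassembles the streamed text in order.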
Quick Start
Instantiate an LlmApi client with your token and base URLs, then call completion() or embed() to start interacting with your LLM.
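The class and method names below (`LlmApi`, `completion()`, `embed()`) come from the description above, but the constructor options, default model names, and routes are assumptions based on the standard OpenAI-compatible wire format; check the package's own documentation for the real signatures. A minimal sketch of an equivalent client:

```javascript
// Sketch of an OpenAI-compatible client shaped like the LlmApi described
// above. Option names, routes, and default models are assumptions.
class LlmApi {
  constructor({ token, baseUrl, fetchFn = fetch }) {
    this.token = token;
    this.baseUrl = baseUrl.replace(/\/$/, ""); // drop trailing slash
    this.fetchFn = fetchFn; // injectable for testing
  }

  async post(path, body) {
    const res = await this.fetchFn(`${this.baseUrl}${path}`, {
      method: "POST",
      headers: {
        Authorization: `Bearer ${this.token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(body),
    });
    if (!res.ok) throw new Error(`LLM API error: ${res.status}`);
    return res.json();
  }

  // Chat completion against the standard /chat/completions route
  completion(messages, model = "gpt-4o-mini") {
    return this.post("/chat/completions", { model, messages });
  }

  // Text embeddings against the standard /embeddings route
  embed(input, model = "text-embedding-3-small") {
    return this.post("/embeddings", { model, input });
  }
}
```

Point `baseUrl` at your provider's OpenAI-compatible endpoint (GitHub Models, Azure OpenAI, or standard OpenAI) and pass the matching token; the injectable `fetchFn` makes the client easy to stub in tests.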
Dependency Matrix
Required Modules
None required
Components
Standard package
💻 Claude Code Installation
Recommended: let Claude Code install it automatically. Copy and paste the text below into Claude Code.
Please help me install this Skill: Name: libllm Download link: https://github.com/copilot-ld/copilot-ld/archive/main.zip#libllm Please download this .zip file, extract it, and install it in the .claude/skills/ directory.