libmemory

Official

Manage LLM context with token budgets.

Author: copilot-ld
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

Provides memory management for building token-budgeted LLM context windows from conversation history and tool definitions, so prompts stay efficient within strict model limits.

Core Features & Use Cases

  • WindowBuilder constructs context windows that fit within token budgets by combining history and tools.
  • MemoryIndex stores and deduplicates conversation identifiers for fast retrieval and reuse.
  • Token budgeting and overhead calculations ensure reliable prompt construction across multiple models.
  • Integration with Memory and Agent services to streamline end-to-end chat workflows.
  • Use Case: build compact, relevant conversation history for an LLM-driven assistant while respecting model limits.
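The MemoryIndex feature above can be illustrated with a minimal sketch. Note that this `MemoryIndex` class, its methods, and the identifier format are assumptions made for illustration, not the package's actual API; the real implementation may differ substantially.

```javascript
// Illustrative sketch only: this class and its methods are stand-ins,
// not libmemory's real API.
class MemoryIndex {
  constructor() {
    this.ids = new Set(); // Set deduplicates conversation identifiers
  }

  // Add identifiers; duplicates are silently ignored.
  add(...identifiers) {
    for (const id of identifiers) this.ids.add(id);
    return this;
  }

  // Fast membership check for retrieval and reuse.
  has(id) {
    return this.ids.has(id);
  }

  // All stored identifiers, in insertion order.
  list() {
    return [...this.ids];
  }
}

const index = new MemoryIndex();
index.add("conv-1", "conv-2", "conv-1"); // "conv-1" is stored only once
console.log(index.list()); // the deduplicated list: ["conv-1", "conv-2"]
```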

Quick Start

Create a memory window by supplying the conversation history, available tools, and a token budget, then pass the resulting messages and tools to the LLM.
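The flow described above can be sketched roughly as follows. Everything here is an illustrative assumption rather than the package's actual API: `buildWindow`, the naive whitespace token counter, and the fixed per-message overhead are all stand-ins (a real implementation would use the target model's tokenizer and the library's own budgeting rules).

```javascript
// Crude token estimate; a real implementation would use the model's tokenizer.
const countTokens = (text) => text.split(/\s+/).filter(Boolean).length;

// Walk history from newest to oldest, keeping messages until the budget
// (after reserving space for tool descriptions and a fixed per-message
// overhead) is exhausted. All names here are hypothetical.
function buildWindow(history, tools, budget, overheadPerMessage = 4) {
  const selected = [];
  let used = tools.reduce((n, t) => n + countTokens(t.description), 0);
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = countTokens(history[i].content) + overheadPerMessage;
    if (used + cost > budget) break;
    used += cost;
    selected.unshift(history[i]); // restore chronological order
  }
  return { messages: selected, tools, tokensUsed: used };
}

const history = [
  { role: "user", content: "What is a context window?" },
  { role: "assistant", content: "The text a model can attend to at once." },
  { role: "user", content: "How do I keep prompts under the limit?" },
];
const tools = [{ name: "search", description: "Search the knowledge base" }];
const window = buildWindow(history, tools, 30);
// The oldest message no longer fits, so only the two newest are kept.
console.log(window.messages.length, window.tokensUsed); // 2 29
```

The resulting `messages` and `tools` would then be passed to the LLM call, with `tokensUsed` available for logging or debugging budget pressure.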

Dependency Matrix

Required Modules

None required

Components

Standard package

💻 Claude Code Installation

Recommended: let Claude install it automatically. Simply copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: libmemory
Download link: https://github.com/copilot-ld/copilot-ld/archive/main.zip#libmemory

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
