letta-configuration

Community

Configure LLM models and providers.

Author: Zurybr
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill simplifies the complex process of configuring Large Language Models (LLMs) and their underlying providers for AI agents and servers, ensuring seamless integration and optimal performance.

Core Features & Use Cases

  • Model Configuration: Set up agents with specific LLM handles, adjust parameters like temperature and token limits, and define context window sizes.
  • Provider Setup: Configure various LLM providers (OpenAI, Anthropic, Azure, Ollama, etc.), including BYOK (Bring Your Own Key) options and self-hosted deployments.
  • Use Case: You need to set up a new AI agent to use the gpt-4o model from OpenAI with a temperature of 0.7 and a context window of 128,000 tokens. This Skill provides the exact code and configuration steps to achieve this.

Quick Start

Configure an agent to use the 'openai/gpt-4o' model with a temperature of 0.7 and a context window limit of 128,000 tokens.
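As a minimal sketch of that Quick Start configuration, the settings can be captured in a plain Python dictionary and sanity-checked before they are handed to a client. The key names (`model`, `temperature`, `context_window`) and the validation helper are illustrative, not a specific Letta API; consult the Skill's references for the exact client calls.

```python
# Target settings from the Quick Start: gpt-4o via OpenAI,
# temperature 0.7, 128k-token context window.
# Key names are hypothetical, not taken from any particular SDK.
llm_config = {
    "model": "openai/gpt-4o",   # "provider/model" handle
    "temperature": 0.7,         # sampling temperature
    "context_window": 128_000,  # context window size in tokens
}

def validate_llm_config(config: dict) -> dict:
    """Basic sanity checks before handing the config to a client."""
    provider, _, model = config["model"].partition("/")
    if not provider or not model:
        raise ValueError("model must use the 'provider/model' handle form")
    if not 0.0 <= config["temperature"] <= 2.0:
        raise ValueError("temperature should be in [0.0, 2.0]")
    if config["context_window"] <= 0:
        raise ValueError("context_window must be a positive token count")
    return config

validate_llm_config(llm_config)
```

Validating up front keeps provider-handle typos (e.g. a missing `openai/` prefix) from surfacing later as opaque API errors.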

Dependency Matrix

Required Modules

None required

Components

scripts, references

💻 Claude Code Installation

Recommended: let Claude install the Skill automatically. Simply copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: letta-configuration
Download link: https://github.com/Zurybr/lefarma-skills/archive/main.zip#letta-configuration

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
