distil-cli
Official
Train task-specific SLMs with the Distil CLI.
Tags: Software Engineering, model-training, data-preparation, local-deployment, distil-cli, SLMs, teacher-evaluation
Author: distil-labs
Version: 1.0.0
Installs: 0
System Documentation
What problem does it solve?
The Distil CLI Skill automates end-to-end training of task-specific small language models (SLMs) using the Distil Labs CLI, reducing manual steps and enabling rapid experimentation.
Core Features & Use Cases
- Data preparation: Generate and format datasets for classification, QA, and tool-calling tasks.
- Model training: Set up experiments, run teacher evaluations, and distill models locally.
- Deployment: Prepare models for local deployment with Ollama or vLLM.
- Use Case: Streamline building a classification model for customer support intents from scratch.
Quick Start
1. Install the Distil Labs CLI and authenticate.
2. Create a model and upload your data.
3. Run teacher evaluation, then run training.
4. Download the trained model.
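The Quick Start flow above can be sketched as a shell session. The command names below are hypothetical placeholders for illustration only, not verified Distil CLI syntax; consult the CLI's built-in help or the official Distil Labs documentation for the actual commands.

```
# Hypothetical sketch — command names are placeholders,
# not verified Distil CLI syntax.
distil auth login                          # authenticate
distil models create support-intents       # create a model
distil data upload train.jsonl             # upload prepared data
distil evaluate --teacher                  # run teacher evaluation
distil train                               # distill the model locally
distil models download support-intents     # fetch the trained model
```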
Dependency Matrix
Required Modules
None required
Components
Standard package
💻 Claude Code Installation
Recommended: let Claude install the Skill automatically. Simply copy and paste the text below into Claude Code.
Please help me install this Skill: Name: distil-cli Download link: https://github.com/distil-labs/distil-cli-skill/archive/main.zip#distil-cli Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
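For a manual installation, the download-extract-install steps might look like the following on a Unix-like system. This is a sketch under assumptions: it assumes `curl` and `unzip` are available, and it relies on GitHub's archive naming convention, where a branch zip unpacks to a `<repo>-<branch>` folder (here `distil-cli-skill-main`). Adjust paths for your setup.

```shell
# Manual install sketch — assumes curl and unzip are available.
mkdir -p ~/.claude/skills
curl -L -o /tmp/distil-cli.zip \
  https://github.com/distil-labs/distil-cli-skill/archive/main.zip
unzip -q /tmp/distil-cli.zip -d /tmp
# GitHub branch archives unpack to <repo>-<branch>; move it into place.
mv /tmp/distil-cli-skill-main ~/.claude/skills/distil-cli
```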