unsloth-cpt


Configure Unsloth for continued pretraining and domain adaptation.

Author: cuba6112
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill addresses the challenges of continued pretraining (CPT) and domain adaptation for large language models, focusing on stable training of the embedding layer (embed_tokens) and the language-modeling head (lm_head).

Core Features & Use Cases

  • Domain Adaptation: Tailor models to new languages or specialized domains (e.g., legal, medical).
  • Embedding & LM Head Training: Optimizes training of the embed_tokens and lm_head layers, which is crucial when adapting to new vocabulary.
  • rsLoRA Stabilization: Utilizes Rank Stabilized LoRA to maintain stability with high LoRA ranks.
  • Use Case: Adapt a general-purpose LLM to understand and generate medical reports by fine-tuning it on a corpus of medical literature, ensuring accurate terminology and context.
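The rsLoRA point above can be illustrated numerically. Standard LoRA scales the adapter update by alpha / r, which shrinks toward zero as the rank grows, while rank-stabilized LoRA scales by alpha / sqrt(r). This is a sketch of the scaling formulas only, not Unsloth's internals:

```python
import math

def lora_scaling(alpha: float, r: int) -> float:
    """Standard LoRA: adapter update is scaled by alpha / r."""
    return alpha / r

def rslora_scaling(alpha: float, r: int) -> float:
    """Rank-stabilized LoRA: adapter update is scaled by alpha / sqrt(r)."""
    return alpha / math.sqrt(r)

# At high ranks the standard scaling collapses toward zero, while the
# rsLoRA scaling decays much more slowly, keeping the adapter's
# contribution to the forward pass stable.
for r in (8, 64, 256):
    print(f"r={r:>3}  lora={lora_scaling(32, r):.4f}  rslora={rslora_scaling(32, r):.4f}")
```

This is why the high LoRA ranks useful for CPT (where embeddings and the LM head are also trained) stay stable under rsLoRA.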

Quick Start

Configure a model for continued pretraining using Unsloth's specialized tools and settings.
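A minimal configuration sketch using Unsloth's public API (FastLanguageModel.from_pretrained and get_peft_model); the model name, rank, and other hyperparameters below are illustrative assumptions, not values fixed by this Skill. Running it requires a GPU and the unsloth package.

```python
# Sketch of a continued-pretraining setup with Unsloth.
# Assumptions: base model choice, rank, and alpha are illustrative.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/mistral-7b-v0.3",  # assumed base model
    max_seq_length=2048,
    load_in_4bit=True,
)

model = FastLanguageModel.get_peft_model(
    model,
    r=128,  # high rank for CPT; rsLoRA keeps this stable
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
        "embed_tokens", "lm_head",  # also train embeddings + LM head
    ],
    lora_alpha=32,
    use_rslora=True,  # rank-stabilized LoRA scaling (alpha / sqrt(r))
)
```

Including embed_tokens and lm_head in target_modules is what enables the new-vocabulary adaptation described above; use_rslora=True applies the rank-stabilized scaling.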

Dependency Matrix

Required Modules

unsloth, torch, peft

Components

scripts, references

💻 Claude Code Installation

Recommended: Let Claude install automatically. Simply copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: unsloth-cpt
Download link: https://github.com/cuba6112/skillfactory/archive/main.zip#unsloth-cpt

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
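The manual equivalent of the steps above can be sketched as follows; the directory layout inside the archive (skillfactory-main/unsloth-cpt) is an assumption about how GitHub packages the main branch:

```shell
# Download, extract, and install the skill into .claude/skills/
# (assumes curl and unzip are available; archive layout is assumed)
mkdir -p .claude/skills
curl -L -o /tmp/skillfactory.zip \
  "https://github.com/cuba6112/skillfactory/archive/main.zip"
unzip -q /tmp/skillfactory.zip -d /tmp
cp -r /tmp/skillfactory-main/unsloth-cpt .claude/skills/
```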
