pytorch-fsdp
Community
PyTorch FSDP training guidance for scalable distributed deep learning.
Author: ovachiever
Version: 1.0.0
Installs: 0
System Documentation
What problem does it solve?
This Skill provides expert guidance on Fully Sharded Data Parallel (FSDP) training in PyTorch, covering parameter sharding, mixed precision, CPU offloading, and advanced FSDP configurations for large-scale models.
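Below is a minimal sketch (not part of the Skill's own files) of the core idea behind parameter sharding: after wrapping a model in FSDP, each rank holds only its shard of the flattened parameters. The script name and model are illustrative, and it assumes a launch via `torchrun --standalone --nproc_per_node=2 shard_demo.py` on a machine with 2 GPUs.

```python
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

def main():
    dist.init_process_group("nccl")
    rank = dist.get_rank()
    torch.cuda.set_device(rank)

    # Toy model (~8.4M parameters) placed on the local GPU before wrapping.
    model = torch.nn.Sequential(
        torch.nn.Linear(1024, 4096), torch.nn.ReLU(), torch.nn.Linear(4096, 1024)
    ).cuda()
    full_numel = sum(p.numel() for p in model.parameters())

    sharded = FSDP(model, device_id=torch.cuda.current_device())
    # After wrapping, parameters() yields this rank's flat shard (size is
    # roughly full_numel / world_size, plus padding).
    local_numel = sum(p.numel() for p in sharded.parameters())

    print(f"rank {rank}: full={full_numel:,} local shard~={local_numel:,}")
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```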
Core Features & Use Cases
- FSDP Fundamentals: Understand joinable constructs, device placement, and sharded gradient reduction (reduce-scatter).
- Deterministic Playbooks: Examples for configuring 2–8+ GPU setups with deterministic sharding.
- Advanced Topics: Mixed precision, CPU offloading, and compatibility with cutting-edge optimizations (see the configuration sketch after this list).
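As a hedged configuration sketch for the advanced topics above (the helper name `wrap_with_fsdp` is illustrative, not from the Skill), the snippet below enables bf16 mixed precision, parameter CPU offloading, and a size-based auto-wrap policy. It assumes an already-initialized NCCL process group and that the current CUDA device is set.

```python
import functools
import torch
from torch.distributed.fsdp import (
    FullyShardedDataParallel as FSDP,
    MixedPrecision,
    CPUOffload,
    ShardingStrategy,
)
from torch.distributed.fsdp.wrap import size_based_auto_wrap_policy

def wrap_with_fsdp(model: torch.nn.Module) -> FSDP:
    bf16 = MixedPrecision(
        param_dtype=torch.bfloat16,   # parameters are all-gathered in bf16
        reduce_dtype=torch.bfloat16,  # gradients are reduce-scattered in bf16
        buffer_dtype=torch.bfloat16,  # buffers (e.g. norm stats) kept in bf16
    )
    return FSDP(
        model,
        sharding_strategy=ShardingStrategy.FULL_SHARD,  # shard params, grads, optimizer state
        auto_wrap_policy=functools.partial(
            size_based_auto_wrap_policy, min_num_params=1_000_000
        ),
        mixed_precision=bf16,
        cpu_offload=CPUOffload(offload_params=True),    # keep sharded params in host RAM
        device_id=torch.cuda.current_device(),
    )
```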
Quick Start
Start with a small-scale FSDP example on a 2-GPU setup, then scale up progressively; a minimal sketch follows.
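The following quick-start sketch (script name and synthetic objective are illustrative) wraps a toy model with FSDP and runs a few training steps on 2 GPUs. Launch it with `torchrun --standalone --nproc_per_node=2 train_fsdp.py`.

```python
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

def main():
    dist.init_process_group("nccl")
    torch.cuda.set_device(dist.get_rank())

    model = FSDP(
        torch.nn.Sequential(
            torch.nn.Linear(512, 2048), torch.nn.ReLU(), torch.nn.Linear(2048, 512)
        ).cuda(),
        device_id=torch.cuda.current_device(),
    )
    # Construct the optimizer *after* wrapping so it sees the sharded parameters.
    optim = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):
        x = torch.randn(32, 512, device="cuda")
        loss = model(x).pow(2).mean()  # dummy objective on synthetic data
        loss.backward()
        optim.step()
        optim.zero_grad()
        if dist.get_rank() == 0:
            print(f"step {step} loss {loss.item():.4f}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```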
Dependency Matrix
Required Modules
None required
Components
references
💻 Claude Code Installation
Recommended: Let Claude install automatically. Simply copy and paste the text below to Claude Code.
Please help me install this Skill:
Name: pytorch-fsdp
Download link: https://github.com/ovachiever/droid-tings/archive/main.zip#pytorch-fsdp
Please download this .zip file, extract it, and install it in the .claude/skills/ directory.