Vram-GPU-OOM
Smart GPU VRAM sharing across services.
Category: Community
Author: lawless-m
Version: 1.0.0
Installs: 0
System Documentation
What problem does it solve?
This Skill provides GPU VRAM sharing and OOM retry patterns to coordinate memory across Ollama, Whisper, and ComfyUI, with auto-unload and signaling endpoints.
Core Features & Use Cases
- Cross-service OOM retry with delays
- Auto-unload on idle to free VRAM
- Optional signaling endpoints to request unloads
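The cross-service retry pattern above can be sketched as a small wrapper: catch an OOM-style failure, optionally signal a sibling service (Ollama, Whisper, ComfyUI) to unload its model, wait, and retry. This is a minimal sketch under assumptions; the Skill's actual retry logic and any unload endpoints are not specified here, so the hook and its use are hypothetical.

```python
import time


def retry_on_oom(fn, retries=3, delay=5.0, on_oom=None):
    """Call fn(), retrying when it raises an OOM-style error.

    on_oom, if given, runs before each retry -- e.g. a hypothetical
    signaling call asking another GPU service to unload its model and
    free VRAM. MemoryError stands in for a real CUDA OOM from the
    underlying service; adapt the except clause to your client library.
    """
    for attempt in range(retries + 1):
        try:
            return fn()
        except MemoryError:
            if attempt == retries:
                raise  # out of retries: surface the OOM to the caller
            if on_oom:
                on_oom()       # hypothetical: request a VRAM unload elsewhere
            time.sleep(delay)  # give the other service time to release memory
```

For example, wrapping an inference call as `retry_on_oom(lambda: generate(prompt), on_oom=ask_comfyui_to_unload)` (where `ask_comfyui_to_unload` is whatever signaling mechanism you configure) retries after each OOM instead of failing outright.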
Quick Start
Implement the OOM retry logic, configure auto-unload on idle, and verify the setup with smaller models before scaling up to larger ones.
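Auto-unload on idle can be sketched as a background watcher that frees VRAM once a service has gone unused for a configured period. The class name, timings, and `unload_fn` below are illustrative assumptions, not the Skill's actual implementation.

```python
import threading
import time


class IdleUnloader:
    """Free VRAM after a period of inactivity (illustrative sketch).

    Call touch() on every request the service handles; unload_fn runs
    once the service has been idle for idle_seconds, e.g. releasing a
    loaded model so other GPU services can allocate that memory.
    """

    def __init__(self, unload_fn, idle_seconds=300.0, poll=1.0):
        self.unload_fn = unload_fn
        self.idle_seconds = idle_seconds
        self.poll = poll
        self.last_used = time.monotonic()
        self.loaded = True
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._watch, daemon=True)
        self._thread.start()

    def touch(self):
        """Record activity; mark the model as (re)loaded."""
        self.last_used = time.monotonic()
        self.loaded = True

    def _watch(self):
        # Poll until stopped; unload once the idle threshold is crossed.
        while not self._stop.wait(self.poll):
            idle = time.monotonic() - self.last_used
            if self.loaded and idle >= self.idle_seconds:
                self.unload_fn()
                self.loaded = False

    def stop(self):
        self._stop.set()
        self._thread.join()
```

In practice `unload_fn` would call whatever unload mechanism your service exposes; pairing this with the retry pattern lets one service's idle timeout free VRAM that another service's retry then succeeds in claiming.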
Dependency Matrix
Required Modules: None required
Components: Standard package

💻 Claude Code Installation
Recommended: Let Claude install automatically. Simply copy and paste the text below to Claude Code.
Please help me install this Skill: Name: Vram-GPU-OOM Download link: https://github.com/lawless-m/Gwen/archive/main.zip#vram-gpu-oom Please download this .zip file, extract it, and install it in the .claude/skills/ directory.