vllm-omni-setup
Install and configure vLLM-Omni across GPUs.
Author: hsliuustc0106
Version: 1.0.0
System Documentation
What problem does it solve?
Simplifies installing and configuring vLLM-Omni across diverse environments, reducing setup time and driver troubleshooting.
Core Features & Use Cases
- Environment setup: Create and activate Python environments, install vLLM-Omni, and configure dependencies.
- Hardware compatibility: Guidance for CUDA, ROCm, NPU, and XPU backends with proper driver versions.
- Validation: Quick smoke test to verify an end-to-end setup on a new machine.
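The validation step above can be sketched as a short smoke test. This is a hedged illustration, not the skill's actual workflow: the import name `vllm` and the CUDA-specific `nvidia-smi` query are assumptions; adjust them for your backend (ROCm, NPU, or XPU) and for vLLM-Omni's actual package name.

```shell
# Hypothetical smoke test -- names and flags are assumptions, not taken from the skill.

# 1. Confirm the package is importable and report its version.
python -c "import vllm; print(vllm.__version__)"

# 2. Confirm the accelerator and driver are visible (CUDA example;
#    use rocm-smi / npu-smi / xpu-smi for other backends).
nvidia-smi --query-gpu=name,driver_version --format=csv
```

If both commands succeed, the environment, package, and driver stack are at least minimally wired together on the new machine.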
Quick Start
Install vLLM-Omni on your server by cloning the vllm-omni repository and running the installation workflow.
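The Quick Start flow above might look like the following sketch. The repository URL, branch, and editable-install layout are assumptions; substitute the actual vllm-omni repository location and whatever install command its own documentation specifies.

```shell
# Sketch of the Quick Start workflow under assumed repo layout.
git clone https://github.com/vllm-project/vllm-omni.git   # placeholder URL
cd vllm-omni

python -m venv .venv          # isolate dependencies from the system Python
source .venv/bin/activate
pip install --upgrade pip
pip install -e .              # assumed editable install of vLLM-Omni
```

Creating a dedicated virtual environment first keeps the CUDA/ROCm-specific wheels this skill installs from conflicting with other projects on the server.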
Dependency Matrix
Required Modules: None required
Components: references
💻 Claude Code Installation
Recommended: Let Claude install automatically. Simply copy and paste the text below to Claude Code.
Please help me install this Skill: Name: vllm-omni-setup Download link: https://github.com/hsliuustc0106/vllm-omni-skills/archive/main.zip#vllm-omni-setup Please download this .zip file, extract it, and install it in the .claude/skills/ directory.