ruvector-attention-wasm-pkg
Community
WASM attention for transformers
Author: ricable
Version: 1.0.0
Installs: 0
System Documentation
What problem does it solve?
This Skill provides optimized WebAssembly implementations of attention mechanisms, which accelerate transformer models and LLMs, especially in resource-constrained environments such as web browsers and edge devices.
Core Features & Use Cases
- High-Performance Attention: Implements multi-head, Flash, and hyperbolic attention, compiled to WASM for speed.
- Versatile Deployment: Runs efficiently in browsers, Node.js, and edge runtimes.
- Use Case: Accelerate LLM token processing in a web application by offloading attention calculations to a fast, client-side WASM module.
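To make the operation concrete, here is a plain-TypeScript reference sketch of single-head scaled dot-product attention, the core computation this package accelerates. This is an illustrative implementation only, not the package's own API:

```typescript
// Reference (non-WASM) sketch of scaled dot-product attention:
// attention(Q, K, V) = softmax(Q Kᵀ / √d_k) V

type Matrix = number[][];

function matmul(a: Matrix, b: Matrix): Matrix {
  return a.map(row =>
    b[0].map((_, j) => row.reduce((sum, v, k) => sum + v * b[k][j], 0))
  );
}

function transpose(m: Matrix): Matrix {
  return m[0].map((_, j) => m.map(row => row[j]));
}

function softmaxRows(m: Matrix): Matrix {
  return m.map(row => {
    const max = Math.max(...row);
    const exps = row.map(v => Math.exp(v - max)); // subtract max for numerical stability
    const sum = exps.reduce((a, b) => a + b, 0);
    return exps.map(e => e / sum);
  });
}

function attention(q: Matrix, k: Matrix, v: Matrix): Matrix {
  const dk = k[0].length;
  // scaled similarity scores between queries and keys
  const scores = matmul(q, transpose(k)).map(row =>
    row.map(s => s / Math.sqrt(dk))
  );
  // each output row is a softmax-weighted combination of V's rows
  return matmul(softmaxRows(scores), v);
}
```

A WASM build replaces these nested-array loops with tight, typed-memory kernels, which is where the client-side speedup comes from.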
Quick Start
Use the ruvector-attention-wasm-pkg skill to initialize the WASM module and then perform multi-head attention with the provided query, key, and value tensors.
Dependency Matrix
Required Modules
None required
Components
references
💻 Claude Code Installation
Recommended: Let Claude install automatically. Simply copy and paste the text below to Claude Code.
Please help me install this Skill:
Name: ruvector-attention-wasm-pkg
Download link: https://github.com/ricable/cli-skills-builder/archive/main.zip#ruvector-attention-wasm-pkg
Please download this .zip file, extract it, and install it in the .claude/skills/ directory.