ruvector-attention-wasm

Community

WebAssembly attention for browser/edge.

Author: ricable
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill provides highly optimized attention mechanisms (FlashAttention, MultiHeadAttention, CrossAttention, LinearAttention) that run directly in the browser or on edge devices, eliminating the need for server-side computation and enabling client-side AI inference.

Core Features & Use Cases

  • Browser-Native Inference: Deploy transformer layers directly in web applications.
  • Edge Computing: Run attention computations on resource-constrained devices.
  • SIMD Acceleration: Leverages WebAssembly SIMD for significant performance gains.
  • Use Case: Integrate advanced AI capabilities like natural language processing or computer vision directly into a web app without relying on a backend API, offering a faster and more private user experience.
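Because the SIMD-accelerated build only runs on engines that implement the WebAssembly SIMD proposal, a page will typically feature-detect before choosing which build to load. The sketch below is not part of this Skill's API; it uses the standard minimal-module technique popularized by feature-detection libraries such as wasm-feature-detect.

```javascript
// Hedged sketch: detect WebAssembly SIMD support at runtime so the page
// can fall back to a scalar build on older engines. The byte array is a
// minimal module whose body uses SIMD instructions; WebAssembly.validate
// returns true only on runtimes that implement the SIMD proposal.
function simdSupported() {
  return WebAssembly.validate(new Uint8Array([
    0, 97, 115, 109, 1, 0, 0, 0,                   // "\0asm" magic + version
    1, 5, 1, 96, 0, 1, 123,                        // type section: () -> v128
    3, 2, 1, 0,                                    // one function of that type
    10, 10, 1, 8, 0, 65, 0, 253, 15, 253, 98, 11,  // body using v128 ops
  ]));
}

// Example: pick a build based on the result.
const build = simdSupported() ? "attention_simd.wasm" : "attention_scalar.wasm";
```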

Quick Start

Install the attention WASM package and create a new `WasmFlashAttention` instance with 8 heads and a dimension of 64.
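For orientation, the plain-JavaScript sketch below shows the scaled dot-product attention computation that a module like this accelerates in WASM. It is an illustrative reference implementation, not the package's API; the `WasmFlashAttention` constructor and its parameters are paraphrased from the Quick Start above and may differ in the actual package.

```javascript
// Reference (unaccelerated) scaled dot-product attention for one head.
// q, k, v are arrays of vectors with shape [seqLen][headDim].

// Numerically stable softmax over a 1-D array.
function softmax(xs) {
  const m = Math.max(...xs);
  const exps = xs.map((x) => Math.exp(x - m));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

function attention(q, k, v) {
  const d = q[0].length; // head dimension
  return q.map((qi) => {
    // Dot product of the query with every key, scaled by 1/sqrt(d).
    const scores = k.map(
      (kj) => qi.reduce((s, x, t) => s + x * kj[t], 0) / Math.sqrt(d)
    );
    const weights = softmax(scores);
    // Output is the attention-weighted sum of the value vectors.
    return v[0].map((_, t) => weights.reduce((s, w, j) => s + w * v[j][t], 0));
  });
}
```

A multi-head layer repeats this per head over projected inputs; the WASM build performs the same computation with SIMD kernels and (for FlashAttention) a tiled, memory-efficient schedule.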

Dependency Matrix

Required Modules

None required

Components

references

💻 Claude Code Installation

Recommended: Let Claude install automatically. Simply copy and paste the text below to Claude Code.

Please help me install this Skill:
Name: ruvector-attention-wasm
Download link: https://github.com/ricable/cli-skills-builder/archive/main.zip#ruvector-attention-wasm

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
