tensorflow-lite

Community

Run on-device ML in React Native apps.

Author: 333-333-333
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

TensorFlow Lite on-device inference enables React Native apps to run ML models without relying on cloud connectivity, reducing latency and preserving user privacy.

Core Features & Use Cases

  • On-device model loading and inference for object detection (COCO-SSD) and image classification (MobileNet); see the sketch after this list.
  • Memory-aware execution with efficient tensor management and optional parallel processing for higher throughput.
  • Use Case: Build offline vision features like real-time scene understanding in mobile apps.
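As a minimal sketch of what that looks like with the standard TensorFlow.js React Native stack (the package names, the `detectObjects` helper, and the `decodeJpeg` flow below are assumptions based on the usual `@tensorflow/tfjs-react-native` and `@tensorflow-models/coco-ssd` APIs, not this skill's exact interface):

```typescript
// Sketch only: on-device COCO-SSD detection with explicit tensor cleanup.
// Package names are the standard TensorFlow.js ones, assumed here.
import * as tf from '@tensorflow/tfjs';
import { decodeJpeg } from '@tensorflow/tfjs-react-native';
import * as cocoSsd from '@tensorflow-models/coco-ssd';

// "detectObjects" is an illustrative helper name, not part of this skill.
async function detectObjects(jpegBytes: Uint8Array) {
  await tf.ready();                    // make sure a backend is initialized
  const model = await cocoSsd.load();  // fetches model weights on first use
  const image = decodeJpeg(jpegBytes); // raw JPEG bytes -> tf.Tensor3D
  try {
    // Each prediction: { bbox: [x, y, width, height], class, score }
    return await model.detect(image);
  } finally {
    image.dispose();                   // release tensor memory explicitly
  }
}
```

Note that `cocoSsd.load()` downloads weights over the network the first time it runs; for fully offline inference you would bundle the model files with the app (tfjs-react-native ships a `bundleResourceIO` helper for that).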

Quick Start

Install TensorFlow.js and the React Native adapter, initialize the library, and load your first model to begin local inference.
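A rough sketch of those steps, assuming the standard `@tensorflow/tfjs`, `@tensorflow/tfjs-react-native`, and `@tensorflow-models/mobilenet` packages (the `classifyImage` helper name is illustrative):

```typescript
// Install the usual dependencies first (peer dependencies such as
// async-storage and expo-gl vary by adapter version; check its docs):
//   npm install @tensorflow/tfjs @tensorflow/tfjs-react-native @tensorflow-models/mobilenet
import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-react-native';  // registers the React Native platform
import * as mobilenet from '@tensorflow-models/mobilenet';

// "classifyImage" is an illustrative name, not part of this skill's API.
async function classifyImage(image: tf.Tensor3D) {
  await tf.ready();                      // wait for the backend before inference
  const model = await mobilenet.load();  // MobileNet image classifier
  // Returns [{ className, probability }, ...] for the top matches
  return model.classify(image);
}
```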

Dependency Matrix

Required Modules

None required

Components

Standard package

💻 Claude Code Installation

Recommended: Let Claude install it automatically. Simply copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: tensorflow-lite
Download link: https://github.com/333-333-333/iris/archive/main.zip#tensorflow-lite

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.

Agent Skills Search Helper

Install a tiny helper in your Agent to search for and equip skills on demand from a library of 223,000+ vetted skills.