doubleword-batch

Official

High-throughput batch inference, made easy.

Author: doublewordai
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

Submitting and coordinating large-scale, cost-efficient inference jobs can be complex and error-prone. This skill streamlines batch processing by interfacing with the Doubleword Batch API to submit, monitor, and retrieve results at scale.

Core Features & Use Cases

  • Orchestrates batch submissions, status polling, and result downloads for LLM workloads.
  • Supports the autobatcher Python client for seamless, cost-efficient batching.
  • Useful for data pipelines, model evaluation, and content generation where latency is acceptable in exchange for cost savings.
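The submit → poll → download loop described above can be sketched as follows. This is a minimal illustration only: the status strings and the injected `fetch_status` callable are assumptions for the sketch, not the documented Doubleword Batch API surface.

```python
# Sketch of the polling step in a submit -> poll -> download workflow.
# Status values ("completed", "failed", "cancelled") are assumed
# terminal states, not taken from the Doubleword Batch API docs.
import time

def poll_until_done(fetch_status, interval=5.0, max_checks=120):
    """Poll a batch job until it reaches a terminal state.

    fetch_status: zero-argument callable returning the job's current
    status string (e.g. a wrapper around the batch API's status call).
    """
    for _ in range(max_checks):
        status = fetch_status()
        if status in ("completed", "failed", "cancelled"):
            return status
        time.sleep(interval)
    raise TimeoutError("batch did not finish within the polling budget")

# Example with a stubbed status source standing in for the real API:
statuses = iter(["validating", "running", "completed"])
result = poll_until_done(lambda: next(statuses), interval=0.0)
print(result)
```

Injecting `fetch_status` keeps the loop testable without network access; in real use it would wrap the skill's status-check call.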

Quick Start

Submit your first batch job to the Doubleword Batch API using the autobatcher client.
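As a rough sketch of what a batch submission payload can look like, the snippet below serializes prompts into JSONL, one request per line, assuming an OpenAI-style batch format. The endpoint path, model name, and field names here are illustrative assumptions, not the confirmed Doubleword or autobatcher API.

```python
# Build a JSONL batch payload, one request object per line.
# The "/v1/chat/completions" path and "example-model" name are
# placeholders assumed for this sketch.
import json

def build_batch_lines(prompts, model="example-model"):
    """Serialize prompts into JSONL lines suitable for a batch upload."""
    lines = []
    for i, prompt in enumerate(prompts):
        lines.append(json.dumps({
            "custom_id": f"request-{i}",          # lets you match results back
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
            },
        }))
    return lines

lines = build_batch_lines(["Summarize document A", "Summarize document B"])
print(len(lines))
```

Each `custom_id` survives the round trip, so results downloaded later can be joined back to the originating prompt.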

Dependency Matrix

Required Modules

None required

Components

Standard package

💻 Claude Code Installation

Recommended: let Claude install it automatically. Copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: doubleword-batch
Download link: https://github.com/doublewordai/batch-skill/archive/main.zip#doubleword-batch

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
