dlt-extract

Category: Community

Build portable DLT pipelines for file-based sources

Author: dtsong
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill streamlines building portable DLT (data load tool) pipelines that extract data from file-based sources. It is aimed at consulting engagements where clients need to run the pipelines themselves after handoff.

Core Features & Use Cases

  • File Source Ingestion: Reads CSV, Excel, Parquet, and JSON files from local directories, SharePoint, and SFTP servers.
  • Destination Swapping: Lets the same pipeline target DuckDB during development and Snowflake or BigQuery in production, switched via environment variables.
  • Schema Contracts: Enforces data quality rules and controls how schema drift is handled.
  • Portable Pipelines: Simplifies client handoff to a pip install plus environment variable configuration.
  • Example Use Case: Ingest daily CSV reports from a client's SFTP server, develop locally against DuckDB, then deploy the same pipeline to load the data into Snowflake.
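The destination-swapping pattern above can be sketched with a small helper. The environment variable name `PIPELINE_DESTINATION` and the `choose_destination` function are illustrative, not part of the Skill:

```python
import os

# Hypothetical helper: resolve the dlt destination from an environment
# variable, defaulting to DuckDB for local development. Moving to
# Snowflake or BigQuery in production then becomes a configuration
# change, not a code change.
def choose_destination() -> str:
    return os.environ.get("PIPELINE_DESTINATION", "duckdb")

os.environ.pop("PIPELINE_DESTINATION", None)  # ensure no override is set
print(choose_destination())  # prints "duckdb"

# In production, the client would set PIPELINE_DESTINATION=snowflake and
# pass the value to dlt, e.g.:
#   pipeline = dlt.pipeline("client_reports", destination=choose_destination())
```

Because the switch lives entirely in the environment, the handed-off code never needs editing to move between development and production targets.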

Quick Start

Use the dlt-extract skill to create a portable DLT pipeline that reads CSV files from a local directory and loads them into DuckDB.

Dependency Matrix

Required Modules

None required

Components

references

💻 Claude Code Installation

Recommended: let Claude install it automatically. Copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: dlt-extract
Download link: https://github.com/dtsong/data-engineering-skills/archive/main.zip#dlt-extract

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
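For a manual install, the steps Claude would carry out might look like the following. The extracted folder name `data-engineering-skills-main` is an assumption based on GitHub's usual archive layout:

```shell
# Download the repository archive. The #dlt-extract fragment in the link
# above only marks the subfolder of interest; the archive contains the
# whole repository.
curl -L -o skills.zip \
  "https://github.com/dtsong/data-engineering-skills/archive/main.zip"
unzip -q skills.zip

# Copy just the dlt-extract Skill into Claude Code's skills directory.
mkdir -p .claude/skills
cp -r data-engineering-skills-main/dlt-extract .claude/skills/
```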
