adversarial-validation
Detect train/test distribution shift
Tags: Data & Analytics, machine learning, feature importance, adversarial validation, data drift, distribution shift, sample weighting
Author: KameniAlexNea
Version: 1.0.0
System Documentation
What problem does it solve?
This Skill identifies and helps mitigate distribution shift between training and testing datasets, which can lead to poor model generalization and a significant gap between cross-validation and leaderboard scores.
Core Features & Use Cases
- Distribution Shift Detection: Uses a classifier to distinguish between train and test data, providing an AUC score to quantify the shift.
- Leaking Feature Identification: Highlights features that are most indicative of the shift, suggesting potential data leaks or structural differences.
- Adversarial Sample Weighting: Generates weights to adjust the training process, emphasizing samples that resemble the test distribution.
- Use Case: Before training a model, run this skill to ensure your training data is representative of the test data. If a significant shift is detected (AUC > 0.55), use the identified features or sample weights to improve model robustness.
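The detection and feature-identification steps above can be sketched as follows. This is a minimal illustration of the adversarial validation technique, not the skill's actual script: it labels train rows 0 and test rows 1, trains a classifier to tell them apart, scores it with out-of-fold AUC, and ranks features by how much they drive the separation. The toy data, function name, and classifier choice are all assumptions for demonstration.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

def adversarial_validation(train_df, test_df):
    """Train a classifier to distinguish train rows from test rows.
    AUC near 0.5 means the distributions match; higher means shift."""
    X = pd.concat([train_df, test_df], ignore_index=True)
    y = np.r_[np.zeros(len(train_df)), np.ones(len(test_df))].astype(int)  # 0=train, 1=test
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    # Out-of-fold predicted probability of each row belonging to the test set
    proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
    auc = roc_auc_score(y, proba)
    # Refit on all rows to rank features by how strongly they signal the shift
    clf.fit(X, y)
    importances = pd.Series(clf.feature_importances_, index=X.columns)
    return auc, importances.sort_values(ascending=False)

# Toy data: feature "a" is deliberately shifted, "b" is not,
# so the AUC should land well above the 0.55 threshold mentioned above
rng = np.random.default_rng(0)
train = pd.DataFrame({"a": rng.normal(0, 1, 500), "b": rng.normal(0, 1, 500)})
test = pd.DataFrame({"a": rng.normal(3, 1, 500), "b": rng.normal(0, 1, 500)})
auc, imp = adversarial_validation(train, test)
print(f"AUC={auc:.2f}; top leaking feature: {imp.index[0]}")
```

Using out-of-fold predictions (rather than scoring on the training fit) keeps the AUC honest; a model scored on data it memorized would report shift even when there is none.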
Quick Start
Run the adversarial validation script to detect distribution shift and identify leaking features.
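For the sample-weighting feature, one common formulation (assumed here, since the skill's internals are not shown) weights each training row by its odds of belonging to the test distribution, p/(1-p), where p comes from the adversarial classifier. Test-like rows then count more during training. The function name, clipping value, and toy data are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

def adversarial_weights(train_df, test_df, clip=10.0):
    """Return a weight per training row: p/(1-p), the estimated odds
    that the row belongs to the test distribution."""
    X = pd.concat([train_df, test_df], ignore_index=True)
    y = np.r_[np.zeros(len(train_df)), np.ones(len(test_df))].astype(int)
    clf = LogisticRegression(max_iter=1000)
    proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
    p = proba[: len(train_df)]                  # p(test) for training rows only
    w = np.clip(p / (1 - p + 1e-12), 0, clip)   # clip to tame extreme odds
    return w / w.mean()                         # normalize so the mean weight is 1

# Toy data: test distribution is shifted right, so training rows with
# larger x should receive larger weights
rng = np.random.default_rng(1)
train = pd.DataFrame({"x": rng.normal(0, 1, 400)})
test = pd.DataFrame({"x": rng.normal(1, 1, 400)})
weights = adversarial_weights(train, test)
print(weights[train["x"] > 0.5].mean() > weights[train["x"] < -0.5].mean())
```

The resulting array can be passed as `sample_weight` to most scikit-learn estimators' `fit` methods; clipping and normalizing keeps a few extreme rows from dominating the loss.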
Dependency Matrix
Required Modules
None required
Components
scripts
💻 Claude Code Installation
Recommended: Let Claude install automatically. Simply copy and paste the text below to Claude Code.
Please help me install this Skill:
Name: adversarial-validation
Download link: https://github.com/KameniAlexNea/gladius-agent/archive/main.zip#adversarial-validation
Please download this .zip file, extract it, and install it in the .claude/skills/ directory.