ensembling

Community

Boost model performance through prediction combination.

Author: KameniAlexNea
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill addresses a common machine-learning challenge: individual models plateau in performance. By intelligently combining their predictions, it helps users reach scores that no single model achieves alone.

Core Features & Use Cases

  • Prediction Combination: Integrates multiple model predictions using various techniques like weighted blending, rank averaging, stacking, and greedy hill-climbing.
  • Model Diversity Assessment: Includes checks for pairwise OOF correlation to ensure ensemble members contribute unique information.
  • Use Case: After training several diverse models (e.g., LightGBM, XGBoost, a Neural Network) for a Kaggle competition, use this Skill to blend their out-of-fold predictions to create a final submission that outperforms any single model.
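The two ideas above can be sketched together on synthetic data: first a pairwise OOF correlation check, then a greedy hill-climbing loop that builds an ensemble by repeatedly adding whichever model most improves the blend. This is an illustrative sketch, not the skill's actual implementation; the model names and data are invented.

```python
import numpy as np
from sklearn.metrics import mean_squared_error

# Hypothetical setup: synthetic OOF predictions from three models on a
# regression target (model names are illustrative, not from the skill).
rng = np.random.default_rng(0)
y = rng.normal(size=500)
oof = {
    "lgbm": y + rng.normal(scale=0.4, size=500),
    "xgb": y + rng.normal(scale=0.5, size=500),
    "nn": y + rng.normal(scale=0.6, size=500),
}

# Diversity check: pairwise correlation of OOF predictions. Values near 1.0
# mean two models contribute little unique information to the ensemble.
preds = np.column_stack(list(oof.values()))
print(np.corrcoef(preds, rowvar=False).round(2))

# Greedy hill-climbing: repeatedly add (with replacement) whichever model
# most improves the running average of the selected predictions.
selected, best_score = [], float("inf")
for _ in range(20):  # cap on greedy steps
    step_score, step_model = best_score, None
    for name, pred in oof.items():
        blend = np.mean([oof[m] for m in selected] + [pred], axis=0)
        score = mean_squared_error(y, blend)
        if score < step_score:
            step_score, step_model = score, name
    if step_model is None:  # no addition improves the score; stop
        break
    selected.append(step_model)
    best_score = step_score

print("selected:", selected, "MSE:", round(best_score, 4))
```

Because selection is with replacement, a strong model can be picked multiple times, which implicitly weights it more heavily in the final average.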

Quick Start

Use the ensembling skill to combine the OOF predictions from 'lgbm' and 'xgb' models with their corresponding test predictions, optimizing weights using the provided training labels 'y_train'.
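A minimal sketch of what such a request might run under the hood, assuming a binary-classification target scored with AUC and a `scipy.optimize.minimize` search over blend weights. The array names mirror the Quick Start, but the data is synthetic and nothing here is the skill's actual implementation.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.metrics import roc_auc_score

# Hypothetical data standing in for real labels and OOF predictions.
rng = np.random.default_rng(42)
y_train = rng.integers(0, 2, size=2000)
oof_lgbm = y_train + rng.normal(scale=0.8, size=2000)  # noisy model scores
oof_xgb = y_train + rng.normal(scale=1.0, size=2000)
oof = np.column_stack([oof_lgbm, oof_xgb])

def neg_auc(w):
    # Normalize to a convex combination, then score the blended OOF predictions.
    w = np.abs(w) / np.abs(w).sum()
    return -roc_auc_score(y_train, oof @ w)

res = minimize(neg_auc, x0=[0.5, 0.5], method="Nelder-Mead")
weights = np.abs(res.x) / np.abs(res.x).sum()
print("weights:", weights.round(3), "blended AUC:", round(-res.fun, 4))

# The same weights would then be applied to the models' test predictions
# to produce the final submission.
```

Optimizing on out-of-fold rather than training-set predictions is what keeps the learned weights from overfitting to any single model's in-sample quirks.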

Dependency Matrix

Required Modules

  • scipy
  • scikit-learn
  • pandas
  • numpy

Components

scripts

💻 Claude Code Installation

Recommended: let Claude install it automatically. Copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: ensembling
Download link: https://github.com/KameniAlexNea/gladius-agent/archive/main.zip#ensembling

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
