hpo


Tune ML model hyperparameters

Author: KameniAlexNea
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill automates the process of finding the optimal hyperparameters for machine learning models, saving significant manual effort and computational resources.

Core Features & Use Cases

  • Bayesian Optimization: Leverages Optuna's TPE sampler for efficient hyperparameter search.
  • Fold-Level Pruning: Implements MedianPruner to cut off unpromising trials early, saving compute.
  • Model Support: Provides templates for LightGBM, XGBoost, and CatBoost.
  • Persistence: Saves results to SQLite, allowing interrupted searches to be resumed.
  • Use Case: After establishing a competitive baseline model, use this Skill to fine-tune its hyperparameters to achieve the best possible performance on your dataset.

Quick Start

Run the hyperparameter optimization script for LightGBM using your data.

Dependency Matrix

Required Modules

optuna, lightgbm, xgboost, catboost

Components

scripts, references

💻 Claude Code Installation

Recommended: let Claude install it automatically. Copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: hpo
Download link: https://github.com/KameniAlexNea/gladius-agent/archive/main.zip#hpo

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
