
HyperSHAP <img src="https://raw.githubusercontent.com/automl/hypershap/main/docs/source/_static/logo/hypershap-logo.png" alt="HyperSHAP Logo" align="right" height="200px"/>


HyperSHAP – a game-theoretic Python library for explaining Hyperparameter Optimization (HPO). It uses Shapley values and interaction indices to provide both local and global insights into how individual hyperparameters (and their interactions) affect a model’s performance.

Features

  • Additive Shapley decomposition of any performance metric across hyperparameters.
  • Interaction analysis via the Faithful Shapley Interaction Index (FSII).
  • Ready‑made explanation tasks for Ablation, Tunability, and Optimizer Bias studies.
  • Integrated visualisation (SI‑graph) for interaction effects.
  • Works with any surrogate model that follows the ExplanationTask interface.
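To build intuition for the first feature, here is a minimal, self-contained sketch of an exact additive Shapley decomposition. The toy value function `v` is a hypothetical stand-in for a performance metric over hyperparameter subsets; this is plain Python, not HyperSHAP's API:

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values: each player's weighted average marginal
    contribution over all subsets of the remaining players."""
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for k in range(n):
            weight = factorial(k) * factorial(n - k - 1) / factorial(n)
            for subset in combinations(others, k):
                s = set(subset)
                total += weight * (value(s | {i}) - value(s))
        phi[i] = total
    return phi

# Hypothetical toy "performance gain" game: tuning lr helps a lot,
# depth a little, lr and depth interact positively, batch_size is inert.
def v(coalition):
    gain = 0.0
    if "lr" in coalition:
        gain += 0.10
    if "depth" in coalition:
        gain += 0.03
    if {"lr", "depth"} <= coalition:
        gain += 0.02  # positive interaction between lr and depth
    return gain

phi = shapley_values(["lr", "depth", "batch_size"], v)
# The values sum to v(all) - v(empty) = 0.15 (efficiency), and the
# 0.02 interaction is split evenly between lr and depth.
```

By additivity, the gain of 0.02 that only appears when lr and depth are tuned together is attributed half to each, so lr receives 0.11 and depth 0.04, while batch_size receives exactly 0.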

Installation

First, create a virtual environment, e.g., via conda:

$ conda create -n hypershap python=3.10
$ conda activate hypershap

Now you can install HyperSHAP via pip:

$ pip install hypershap

Or, clone the git repository and install hypershap via the Makefile:

$ git clone https://github.com/automl/hypershap
$ cd hypershap
$ make install

Getting Started

Given an existing setup with a ConfigurationSpace from the ConfigSpace package and a black-box function as follows:

from ConfigSpace import ConfigurationSpace, Configuration

# ConfigurationSpace describing the hyperparameter space
cs = ConfigurationSpace()
  ...

# A black-box function, evaluating ConfigSpace.Configuration objects
def blackbox_function(cfg: Configuration) -> float:
  ...

You can use HyperSHAP as follows:

from hypershap import ExplanationTask, HyperSHAP

# Wrap the search space and black-box function in an explanation task
task = ExplanationTask.from_function(config_space=cs, function=blackbox_function)

# Instantiate HyperSHAP
hypershap = HyperSHAP(task)
# Conduct tunability analysis
hypershap.tunability(baseline_config=cs.get_default_configuration())
# Plot results as a Shapley Interaction graph
hypershap.plot_si_graph()

The example demonstrates how to:

  1. Wrap a black-box function in an explanation task.
  2. Use HyperSHAP to obtain interaction values for the tunability game.
  3. Plot the corresponding SI-graph.
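For intuition, the cooperative game behind the tunability analysis can be sketched in plain Python: the value of a coalition of hyperparameters is the best score reachable when only those hyperparameters may deviate from the baseline. The grids and score function below are hypothetical stand-ins for the real configuration space and black-box function, not HyperSHAP internals:

```python
from itertools import product

# Hypothetical toy grids, baseline, and score function.
grids = {"lr": [0.001, 0.01, 0.1], "depth": [2, 4, 8]}
baseline = {"lr": 0.001, "depth": 2}

def score(cfg):
    # Stand-in for the black-box function.
    return (0.10 if cfg["lr"] == 0.01 else 0.0) + (0.05 if cfg["depth"] == 8 else 0.0)

def tunability_value(coalition):
    """Tunability-game value of a coalition: the best score reachable
    when only the hyperparameters in the coalition may be tuned away
    from the baseline configuration."""
    free = [h for h in grids if h in coalition]
    best = score(baseline)
    for values in product(*(grids[h] for h in free)):
        cfg = dict(baseline, **dict(zip(free, values)))
        best = max(best, score(cfg))
    return best
```

Here the empty coalition yields the baseline score, tuning lr alone gains 0.10, depth alone 0.05, and both together 0.15; Shapley values of this game are what the tunability analysis attributes to each hyperparameter.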

API Overview

| Method | Purpose |
|--------|---------|
| `HyperSHAP(explanation_task, n_workers, max_hyperparameters_exact, approximation_budget)` | Initialize the explainer with a generic `ExplanationTask`. Optionally set `n_workers` to the number of CPU cores to use for parallelization. `max_hyperparameters_exact` determines up to which number of hyperparameters exact Shapley values are computed; beyond it, a budget of `approximation_budget` is used to approximate them. |
| `ablation(config_of_interest, baseline_config, index="FSII", order=2)` | Explain the contribution of each hyperparameter value (and interactions) when moving from a baseline to a specific configuration. |
| `ablation_multibaseline(config_of_interest, baseline_config, aggregation, index="FSII", order=2)` | Explain the contribution of each hyperparameter value (and interactions) when moving from several baselines to a specific configuration. Values are aggregated via the given `aggregation` operator. |
| `tunability(baseline_config=None, index="FSII", order=2, n_samples=10_000)` | Quantify how much performance can be gained by tuning subsets of hyperparameters. |
| `sensitivity(baseline_config=None, index="FSII", order=2, n_samples=10_000)` | Quantify how much performance variance is induced by varying subsets of hyperparameters. |
| `mistunability(baseline_config=None, index="FSII", order=2, n_samples=10_000)` | Quantify how much performance can be lost by mistuning subsets of hyperparameters. |
| `optimizer_bias(optimizer_of_interest, optimizer_ensemble, index="FSII", order=2)` | Attribute performance differences to a particular optimizer vs. an ensemble of optimizers. |
| `plot_si_graph(interaction_values=None, save_path=None)` | Plot the Shapley Interaction (SI) graph; uses the most recent interaction values if none are supplied. |
| `plot_upset(interaction_values=None, save_path=None)` | Plot interaction values as an upset plot; uses the most recent interaction values if none are supplied. |
| `plot_force(interaction_values=None, save_path=None)` | Plot interaction values as a force plot; uses the most recent interaction values if none are supplied. |
| `plot_waterfall(interaction_values=None, save_path=None)` | Plot interaction values as a waterfall plot; uses the most recent interaction values if none are supplied. |
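The `index` and `order` parameters above select an interaction index. For intuition, here is a self-contained sketch of the pairwise Shapley interaction index, a simpler relative of FSII: a weighted average of the discrete second derivative of the value function. The toy game `v` is hypothetical; this is not HyperSHAP's API:

```python
from itertools import combinations
from math import factorial

def pairwise_interaction(players, value, i, j):
    """Pairwise Shapley interaction index of players i and j: a weighted
    average of v(S+ij) - v(S+i) - v(S+j) + v(S) over subsets S that
    contain neither i nor j."""
    rest = [p for p in players if p not in (i, j)]
    n = len(players)
    total = 0.0
    for k in range(len(rest) + 1):
        weight = factorial(k) * factorial(n - k - 2) / factorial(n - 1)
        for subset in combinations(rest, k):
            s = set(subset)
            total += weight * (
                value(s | {i, j}) - value(s | {i}) - value(s | {j}) + value(s)
            )
    return total

# Hypothetical toy game: lr and depth carry a 0.02 positive interaction;
# batch_size interacts with nothing.
def v(coalition):
    gain = 0.10 * ("lr" in coalition) + 0.03 * ("depth" in coalition)
    if {"lr", "depth"} <= coalition:
        gain += 0.02
    return gain

players = ["lr", "depth", "batch_size"]
```

For this game the index recovers exactly the 0.02 interaction between lr and depth and 0 for any pair involving batch_size; positive values would appear as connecting edges in the SI-graph.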
