
PyXAB - Python X-Armed Bandit

<p align="left"> <a style="border-width:0" href="https://doi.org/10.21105/joss.06507"> <img src="https://joss.theoj.org/papers/10.21105/joss.06507/status.svg" alt="DOI badge" > </a> <a href="https://zenodo.org/doi/10.5281/zenodo.13743085"> <img src="https://zenodo.org/badge/470722183.svg" alt="DOI" /> </a> <a href='https://pypi.org/project/PyXAB/'> <img src='https://img.shields.io/pypi/v/PyXAB.svg?color=yellow' alt='PyPI version' /> </a> <a href="https://codecov.io/gh/WilliamLwj/PyXAB" > <img src="https://codecov.io/gh/WilliamLwj/PyXAB/branch/main/graph/badge.svg?token=VACRX9AQBM"/> </a> <a href='https://pyxab.readthedocs.io/en/latest/?badge=latest'> <img src='https://readthedocs.org/projects/pyxab/badge/?version=latest' alt='Documentation Status' /> </a> <a href="https://github.com/WilliamLwj/PyXAB/actions/workflows/codeql.yml" target="blank"> <img src="https://github.com/WilliamLwj/PyXAB/actions/workflows/codeql.yml/badge.svg" alt="CodeQL" /> </a> <a href="https://github.com/WilliamLwj/PyXAB/actions/workflows/testing.yml" target="blank"> <img src="https://github.com/WilliamLwj/PyXAB/actions/workflows/testing.yml/badge.svg" alt="testing" /> </a> <a href="https://github.com/WilliamLwj/PyXAB/fork" target="blank"> <img src="https://img.shields.io/github/forks/WilliamLwj/PyXAB?" alt="github-PyXAB forks"/> </a> <a href="https://github.com/WilliamLwj/PyXAB/stargazers" target="blank"> <img src="https://img.shields.io/github/stars/WilliamLwj/PyXAB?" alt="github-PyXAB stars"/> </a> <a href="https://pepy.tech/project/pyxab" target="blank"> <img src="https://static.pepy.tech/badge/pyxab" alt="downloads"/> </a> <a href="https://github.com/WilliamLwj/PyXAB/blob/main/LICENSE" target="blank"> <img src="https://img.shields.io/github/license/WilliamLwj/PyXAB?color=purple" alt="github-PyXAB license" /> </a> <a href="https://github.com/psf/black" target="blank"> <img src="https://img.shields.io/badge/code%20style-black-000000.svg" alt="Code style: black" /> </a> </p>

PyXAB is an open-source Python library of X-armed bandit algorithms, a family of optimizers for online black-box optimization and hyperparameter optimization.

<p align='center'> <img src="https://raw.githubusercontent.com/WilliamLwj/PyXAB/main/figs/HCT_trajectory.gif" alt="trajectory" width="48%"/> <img src="https://raw.githubusercontent.com/WilliamLwj/PyXAB/main/figs/HCT_heatmap.gif" alt="heatmap" width="48%"/> </p>

PyXAB contains implementations of 10+ optimization algorithms, including classic ones such as Zooming, StoSOO, and HCT, and more recent works such as GPO, StroquOOL, and VHCT. PyXAB also provides the most commonly used synthetic objectives for evaluating the performance of different algorithms, as well as implementations of different hierarchical partitions.
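To give a feel for the hierarchical-partition idea, here is a rough sketch in plain Python (not the PyXAB partition API; the cell representation is an assumption made for illustration): a binary partition halves each cell at every depth, producing an ever-finer tree of subdomains.

```python
def split_binary(cell):
    """Split a 1-D cell [low, high] into two halves.
    Illustration only -- not PyXAB's partition classes."""
    low, high = cell
    mid = (low + high) / 2
    return [low, mid], [mid, high]

# Depth-2 binary partition of the domain [0, 1]
root = [0.0, 1.0]
left, right = split_binary(root)
leaves = [cell for half in (left, right) for cell in split_binary(half)]
print(leaves)  # [[0.0, 0.25], [0.25, 0.5], [0.5, 0.75], [0.75, 1.0]]
```

An X-armed bandit algorithm walks down such a tree, spending more evaluations in the cells that look most promising.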

PyXAB is featured for:

  • User-friendly APIs, clear documentation, and detailed examples
  • A comprehensive library of optimization algorithms, partitions, and synthetic objectives
  • High code quality and high testing coverage
  • Few dependencies, for flexible combination with other packages such as PyTorch and scikit-learn

Reminder: The algorithms are maximization algorithms!
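Because the algorithms maximize, an objective that should be minimized must be negated before it is reported as the reward. A minimal sketch (the quadratic loss here is a made-up example, not part of the library):

```python
def loss(x):
    # Hypothetical objective to minimize: squared error around 0.3
    return (x - 0.3) ** 2

def reward(x):
    # The algorithms maximize, so negate the loss before reporting it
    return -loss(x)

print(reward(0.3) > reward(0.9))  # True: the loss minimizer gets the highest reward
```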


Quick Example

PyXAB follows a natural and straightforward API design that aligns with the online black-box optimization paradigm. The following is a simple six-line usage example.

First, define the parameter domain and the algorithm to run. At every round t, call algo.pull(t) to obtain a point, then call algo.receive_reward(t, reward) to pass the algorithm the objective evaluation (reward) at that point.

from PyXAB.algos.HOO import T_HOO

domain = [[0, 1]]               # Parameter is 1-D and between 0 and 1
algo = T_HOO(rounds=1000, domain=domain) 
for t in range(1000):
    point = algo.pull(t)
    reward = 1                  # TODO: replace with the user-defined objective evaluated at point
    algo.receive_reward(t, reward)

More detailed examples can be found in the documentation.
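To see the pull/receive_reward protocol end to end without installing anything, the sketch below fakes the algorithm side with a trivial random-search stand-in. RandomSearch and the quadratic objective are illustrative stand-ins only, not PyXAB classes; any PyXAB algorithm slots into the same loop.

```python
import random

class RandomSearch:
    """Toy stand-in mimicking the pull/receive_reward protocol of a
    PyXAB algorithm. Illustration only -- not part of the library."""

    def __init__(self, domain):
        self.domain = domain
        self.last_point = None
        self.best_point, self.best_reward = None, float("-inf")

    def pull(self, t):
        # Propose a uniformly random point from the domain
        self.last_point = [random.uniform(low, high) for low, high in self.domain]
        return self.last_point

    def receive_reward(self, t, reward):
        # Remember the best point observed so far
        if reward > self.best_reward:
            self.best_reward, self.best_point = reward, self.last_point

random.seed(0)
algo = RandomSearch(domain=[[0, 1]])
for t in range(200):
    point = algo.pull(t)
    reward = 1 - (point[0] - 0.5) ** 2   # example objective, maximized at x = 0.5
    algo.receive_reward(t, reward)

print(algo.best_point)   # a point close to [0.5]
```

Swapping RandomSearch for a real algorithm such as T_HOO leaves the loop unchanged, which is the point of the API design.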


Installation

To install via pip, run the following lines of code

pip install PyXAB                 # normal install
pip install --upgrade PyXAB       # or update if needed

To install via git, run the following lines of code

git clone https://github.com/WilliamLwj/PyXAB.git
cd PyXAB
pip install .

Features:

X-armed bandit algorithms

  • Algorithms marked with * are meta-algorithms (wrappers)

| Algorithm | Research Paper | Year |
|-----------|----------------|------|
| Zooming | Multi-Armed Bandits in Metric Spaces | 2008 |
| T-HOO | X-Armed Bandit | 2011 |
| DOO | Optimistic Optimization of a Deterministic Function without the Knowledge of its Smoothness | 2011 |
| SOO | Optimistic Optimization of a Deterministic Function without the Knowledge of its Smoothness | 2011 |
| StoSOO | Stochastic Simultaneous Optimistic Optimization | 2013 |
| HCT | Online Stochastic Optimization Under Correlated Bandit Feedback | 2014 |
| POO* | Black-box optimization of noisy functions with unknown smoothness | 2015 |
| GPO* | General Parallel Optimization Without A Metric | 2019 |
| PCT | General Parallel Optimization Without A Metric | 2019 |
| SequOOL | A Simple Parameter-free And Adaptive Approach to Optimization Under A Minimal Local Smoothness Assumption | 2019 |
| StroquOOL | A Simple Parameter-free And Adaptive Approach to Optimization Under A Minimal Local Smoothness Assumption | 2019 |
| VROOM | Derivative-Free & Order-Robust Optimisation | 2020 |
| VHCT | Optimum-statistical Collaboration Towards General and Efficient Black-box Optimization | 2023 |
| VPCT | N.A. (GPO + VHCT) | |
