# BBopt
BBopt aims to provide the easiest hyperparameter optimization you'll ever do. Think of BBopt like Keras (back when Theano was still a thing) for black box optimization: one universal interface for working with any black box optimization backend.
BBopt's features include:

- a universal API for defining your tunable parameters based on the standard library `random` module (so you don't even have to learn anything new!),
- tons of state-of-the-art black box optimization algorithms, such as Gaussian processes from `scikit-optimize` or tree-structured Parzen estimation from `hyperopt`, for tuning parameters,
- the ability to switch algorithms while retaining all previous trials, and even dynamically choose the best algorithm for your use case,
- multiprocessing-safe data saving to enable running multiple trials in parallel,
- lots of data visualization methods, including support for everything in `skopt.plots`,
- support for optimizing over conditional parameters that only appear during some runs,
- support for all major Python versions (`2.7` or `3.6+`), and
- a straightforward interface for extending BBopt with your own custom algorithms.
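The first bullet above is worth a concrete look: BBopt's parameter-definition methods mirror the standard library `random` module, just with an extra leading name argument for each parameter. A stdlib-only sketch of the correspondence (the specific `bb.*` calls in the comments follow that pattern; check BBopt's API docs for the exact set of supported methods):

```python
import random

# Plain stdlib calls whose signatures BBopt's parameter methods mirror;
# the bb.* equivalents add a leading name so each parameter can be tracked.
x = random.uniform(0, 1)         # bb.uniform("x", 0, 1)
k = random.randrange(1, 10)      # bb.randrange("k", 1, 10)
c = random.choice(["a", "b"])    # bb.choice("c", ["a", "b"])
g = random.gauss(0, 1)           # bb.gauss("g", 0, 1)
```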
Once you've defined your parameters, training a black box optimization model on those parameters is as simple as

```
bbopt your_file.py
```

and serving your file with optimized parameters is as easy as

```python
import your_file
```
Questions? Head over to BBopt's Gitter if you have any questions/comments/etc. regarding BBopt.
## Installation
To get going with BBopt, simply install it with

```
pip install bbopt
```

or, to also install the extra dependencies necessary for running BBopt's examples, run `pip install bbopt[examples]`.
## Basic Usage
To use BBopt, just add

```python
# BBopt setup:
from bbopt import BlackBoxOptimizer
bb = BlackBoxOptimizer(file=__file__)
if __name__ == "__main__":
    bb.run()
```
to the top of your file, then call a random method like

```python
x = bb.uniform("x", 0, 1)
```

for each of the tunable parameters in your model, and finally add `bb.maximize(y)` or `bb.minimize(y)` to set the value being optimized. Then, run

```
bbopt <your file here> -n <number of trials> -j <number of processes>
```

to train your model, and just

```
import <your module here>
```

to serve it!
Note: Neither `__file__` nor `__name__` is available in Jupyter notebooks. In that case, just set up BBopt with:

```python
import os

# BBopt setup:
from bbopt import BlackBoxOptimizer
bb = BlackBoxOptimizer(data_dir=os.getcwd(), data_name="my_project_name")
```
## Examples
Some examples of BBopt in action:
- `random_example.py`: Extremely basic example using the `random` backend.
- `skopt_example.py`: Slightly more complex example making use of the `gaussian_process` algorithm from the `scikit-optimize` backend.
- `hyperopt_example.py`: Example showcasing the `tree_structured_parzen_estimator` algorithm from the `hyperopt` backend.
- `meta_example.py`: Example of using `run_meta` to dynamically choose an algorithm.
- `numpy_example.py`: Example which showcases how to have numpy array parameters.
- `conditional_skopt_example.py`: Example of having black box parameters that are dependent on other black box parameters, using the `gaussian_process` algorithm from the `scikit-optimize` backend.
- `conditional_hyperopt_example.py`: Example of doing conditional parameters with the `tree_structured_parzen_estimator` algorithm from the `hyperopt` backend.
- `bask_example.py`: Example of using conditional parameters with a semi-random target, using the `bask_gp` algorithm from the `bayes-skopt` backend.
- `pysot_example.py`: Example of using the full API to implement an optimization loop and avoid the overhead of running the entire file multiple times, while making use of the `pySOT` backend.
- `keras_example.py`: Complete example of using BBopt to optimize a neural network built with Keras. Uses the full API to implement its own optimization loop and thus avoid the overhead of running the entire file multiple times.
- `any_fast_example.py`: Example of using the default algorithm `"any_fast"` to dynamically select a good backend.
- `mixture_example.py`: Example of using the `mixture` backend to randomly switch between different algorithms.
- `json_example.py`: Example of using `json` instead of `pickle` to save parameters.
## Full API
### Command-Line Interface
The `bbopt` command is extremely simple in terms of what it actually does. For the command `bbopt <file> -n <trials> -j <processes>`, BBopt simply runs `python <file>` a number of times equal to `<trials>`, split across `<processes>` different processes.
Why does this work? If you're using the basic boilerplate, then running `python <file>` will trigger the `if __name__ == "__main__":` clause, which will run a training episode. But when you go to import your file, the `if __name__ == "__main__":` clause won't get triggered, and you'll just get served the best parameters found so far.

Since the command-line interface is so simple, advanced users who want to use the full API instead of the boilerplate need not use the `bbopt` command at all. If you want more information on the `bbopt` command, just run `bbopt -h`.
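Because the CLI just reruns your file, you can reproduce what `bbopt <file> -n <trials>` does with `-j 1` in a few lines of stdlib Python. A minimal sketch (the `run_trials` helper name is hypothetical, not part of BBopt):

```python
import subprocess
import sys

def run_trials(path, n_trials):
    """Roughly what `bbopt <path> -n <n_trials> -j 1` does:
    run `python <path>` once per trial, sequentially."""
    for _ in range(n_trials):
        subprocess.run([sys.executable, path], check=True)
```

The real `bbopt` command additionally splits the trials across `-j` processes, which is safe because BBopt's data saving is multiprocessing-safe.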
### Black Box Optimization Methods
#### Constructor
```python
BlackBoxOptimizer(file, *, tag=None, protocol=None)
BlackBoxOptimizer(data_dir, data_name, *, tag=None, protocol=None)
```
Create a new `bb` object; this should be done at the beginning of your program, as all the other functions are methods of this object.
`file` is used by BBopt to figure out where to load and save data, and should usually just be set to `__file__`. `tag` allows additional customization of the BBopt data file for when multiple BBopt instances might be desired for the same file. Specifically, BBopt will save data to `os.path.splitext(file)[0] + "_" + tag + extension`.
Alternatively, `data_dir` and `data_name` can be used to specify where to save and load data. In that case, BBopt will save data to `os.path.join(data_dir, data_name + extension)` if no tag is passed, or `os.path.join(data_dir, data_name + "_" + tag + extension)` if a tag is given.
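The naming scheme above can be sketched as a small helper (illustrative only; `data_path_for` and the `.bbopt` extension default are hypothetical, not BBopt API; see the `protocol` discussion below for what actually determines the data format):

```python
import os

def data_path_for(data_dir, data_name, tag=None, extension=".bbopt"):
    # Mirrors the naming scheme described above: data_name, then an
    # optional "_" + tag, then the data-file extension, joined onto data_dir.
    name = data_name if tag is None else data_name + "_" + tag
    return os.path.join(data_dir, name + extension)
```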
`protocol` determines how BBopt serializes data. If `None` (the default), BBopt will use pickle protocol 2, which is the highest version that works on both Python 2 and Python 3 (unless a `json`