BONNI: Bayesian Optimization via Neural Network surrogates and Interior Point Optimization
BONNI optimizes any black-box function with gradient information. Especially in optimizations with many degrees of freedom, gradient information increases optimization speed. In the image, the surrogate fits the function almost perfectly with few observations.
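To see why gradients help: each evaluation with a gradient contributes d+1 pieces of information instead of one. A toy sketch (this is not BONNI's actual surrogate model):

```python
import numpy as np

# Toy illustration: a single observation of value and gradient already
# determines a local first-order model of the function.
def f(x):
    return x[0] ** 2 + x[1], np.asarray([2 * x[0], 1.0])

x0 = np.asarray([0.5, 0.5])
v, g = f(x0)

def surrogate(x):
    # local linear model built from one (value, gradient) sample
    return v + g @ (x - x0)

print(surrogate(np.asarray([0.6, 0.5])))  # ≈ 0.85, true value ≈ 0.86
```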

Installation
You can install BONNI simply via
pip install bonni
We also recommend installing JAX's GPU acceleration, which massively increases speed:
pip install "jax[cuda]"
Usage
BONNI provides an optimization wrapper similar to the scipy.optimize.minimize API:
from bonni import optimize_bonni
from pathlib import Path
import numpy as np


def fn(x: np.ndarray):
    # The input function should return the function value and its gradient
    value = x[0] ** 2 + x[1]
    grad = np.asarray([2 * x[0], 1])
    return value, grad


xs, ys, gs = optimize_bonni(
    fn=fn,
    bounds=np.asarray([[-1, 1], [0, 1]], dtype=float),
    # BO requires some samples before iterations start. You can either explicitly provide
    # previous fn evals via `xs=..., ys=..., gs=...` or specify a number of random samples.
    num_bonni_iterations=5,
    num_random_samples=2,
    direction="minimize",
    save_path=Path.cwd(),  # save data as npz here
    seed=42,
)
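The returned arrays contain the evaluated points, values, and gradients. Assuming `xs` has shape (n, d) and `ys` shape (n,), the best observed design can be extracted with a simple argmin (an illustrative post-processing step, not part of the BONNI API):

```python
import numpy as np

# Hypothetical example data in the assumed shapes: xs is (n, d), ys is (n,)
xs = np.asarray([[0.2, 0.1], [-0.5, 0.9], [0.0, 0.3]])
ys = np.asarray([0.14, 1.15, 0.30])

best_idx = np.argmin(ys)  # use argmax when direction="maximize"
best_x, best_y = xs[best_idx], ys[best_idx]
print(best_x, best_y)  # → [0.2 0.1] 0.14
```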
Additionally, BONNI includes a wrapper for IPOPT. The standard IPOPT package can be difficult to install and use, so we provide a more convenient interface:
from bonni import optimize_ipopt

xs, ys, gs = optimize_ipopt(
    fn=fn,
    x0=np.asarray([0.5, 0.5]),  # starting point of the optimization
    bounds=np.asarray([[-1, 1], [0, 1]], dtype=float),
    # IPOPT performs a line search in each iteration, so the number
    # of iterations and function evaluations may differ
    max_fn_eval=5,
    max_iterations=3,
    direction="maximize",
    save_path=Path.cwd(),
)
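Both wrappers expect the same `fn` contract: a scalar value and a gradient matching the shape of `x`. A quick sanity check before starting a long optimization run can catch contract violations early (an illustrative helper, not part of BONNI's API):

```python
import numpy as np

# Illustrative sanity check: fn must return a scalar value and a
# gradient with the same shape as its input x.
def check_fn(fn, x):
    value, grad = fn(x)
    assert np.ndim(value) == 0, "value must be a scalar"
    assert np.shape(grad) == np.shape(x), "gradient must match x in shape"
    return value, grad

def fn(x):
    return x[0] ** 2 + x[1], np.asarray([2 * x[0], 1.0])

v, g = check_fn(fn, np.asarray([0.5, 0.5]))
print(v)  # → 0.75
```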
Documentation
You can find the full documentation of BONNI here.
Examples
Distributed Bragg Reflector

This is a 10-dimensional optimization of the layer heights of a distributed Bragg reflector for color correction in µ-LEDs.
The target spectrum is a step function around a wavelength of 620 nm.
Compared to other optimization algorithms, BONNI yields the best designs.
For details, we refer to the paper.
The full code for the optimization can be found at scripts/bragg_reflector.py.
Dual-Layer Grating Coupler

This is a 62-dimensional optimization of the widths and gap sizes of a dual-layer grating coupler.
Compared to other optimization algorithms, BONNI yields the best designs.
For details, we refer to the paper.
The full code for the optimization can be found at scripts/grating_coupler.py.
Citation
If you find this repository helpful for your research, please consider citing:
TODO: insert citation as soon as the paper is online.
Other Links
Also check out my other repositories:
💡 fdtdx
A high-performance, differentiable and GPU-accelerated Finite-Difference Time-Domain solver.