

title image


BONNI: Bayesian Optimization via Neural Network surrogates and Interior Point Optimization

BONNI optimizes any black-box function WITH gradient information. Especially in optimizations with many degrees of freedom, gradient information greatly increases optimization speed. In the image, the surrogate fits the function almost perfectly with only a few observations.

surrogate image

Installation

You can install BONNI simply via

pip install bonni

We also recommend installing the GPU acceleration for JAX, which massively increases speed:

pip install jax[cuda]

Usage

BONNI provides an optimization wrapper with an API similar to scipy.optimize.minimize:

from bonni import optimize_bonni
from pathlib import Path
import numpy as np

def fn(x: np.ndarray):
    # Input function should return function value and gradient
    value = x[0] ** 2 + x[1]
    grad = np.asarray([2 * x[0], 1])
    return value, grad

xs, ys, gs = optimize_bonni(
    fn=fn,
    bounds=np.asarray([[-1, 1], [0, 1]], dtype=float),
    # BO requires some samples before iterations start. You can either explicitly provide
    # previous fn evals via `xs=..., ys=..., gs=...` or specify a number of random samples.
    num_bonni_iterations=5,
    num_random_samples=2,
    direction="minimize",
    save_path=Path.cwd(), # save data as npz here
    seed=42,
)

Additionally, BONNI includes a wrapper for IPOPT. The standard IPOPT package can be difficult to install and use, so we provide a more convenient interface:

from bonni import optimize_ipopt
xs, ys, gs = optimize_ipopt(
    fn=fn,
    x0=np.asarray([0.5, 0.5]),  # startpoint of optimization
    bounds=np.asarray([[-1, 1], [0, 1]], dtype=float),
    # IPOPT performs line search each iteration, such that the number 
    # of iterations and fn_eval may not be the same
    max_fn_eval=5,
    max_iterations=3,
    direction="maximize",
    save_path=Path.cwd(),
)
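The comment on line search can be illustrated with a generic backtracking (Armijo) line search. This is a simplified sketch, not IPOPT's actual interior-point algorithm; it only shows why each outer iteration may consume several function evaluations, so `max_fn_eval` and `max_iterations` are separate limits:

```python
import numpy as np

def fn(x):
    # Simple convex toy objective with analytic gradient.
    value = x[0] ** 2 + x[1] ** 2
    grad = 2 * x
    return value, grad

x = np.asarray([1.0, 1.0])
n_eval = 0
for it in range(3):  # 3 outer iterations
    f0, g = fn(x)
    n_eval += 1
    step = 1.0
    # Backtrack until the Armijo sufficient-decrease condition holds;
    # every trial step costs one extra function evaluation.
    while True:
        f_new, _ = fn(x - step * g)
        n_eval += 1
        if f_new <= f0 - 1e-4 * step * (g @ g):
            break
        step *= 0.5
    x = x - step * g

print(f"iterations={it + 1}, fn_evals={n_eval}")
```

Here 3 iterations cost more than 3 function evaluations, which is why the wrapper exposes both `max_iterations` and `max_fn_eval`.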

Documentation

You can find the full documentation of BONNI here.

Examples

Distributed Bragg Reflector

dbr image

This is a 10d optimization of the layer heights of a distributed Bragg reflector for color correction in µ-LEDs. The target spectrum is a step function around a 620 nm wavelength. Compared to other optimization algorithms, BONNI yields the best designs. For details, we refer to the paper. The full code for the optimization can be found at scripts/bragg_reflector.py.

Dual-Layer Grating Coupler

gc image

This is a 62d optimization of the widths and gap sizes of a dual-layer grating coupler. Compared to other optimization algorithms, BONNI yields the best designs. For details, we refer to the paper. The full code for the optimization can be found at scripts/grating_coupler.py.

Citation

If you find this repository helpful for your research, please consider citing:

TODO: insert citation as soon as the paper is online.

Other Links

Also check out my other repositories:

💡 fdtdx


A high-performance, differentiable and GPU-accelerated Finite-Difference Time-Domain solver.
