Pypop

[JMLR (CCF-A)] PyPop7: A Pure-PYthon LibrarY for POPulation-based Black-Box Optimization (BBO), especially *Large-Scale* algorithm variants (from evolutionary computation, swarm intelligence, statistics, operations research, machine learning, mathematical optimization, meta-heuristics, auto-control etc.). [https://jmlr.org/papers/v25/23-0386.html]

README

PyPop7: A Pure-PYthon librarY of POPulation-based OPtimization in continuous black-box cases [CCF-A]

<img src="https://github.com/Evolutionary-Intelligence/pypop/blob/main/docs/logo/MayorIcons.png" alt="drawing" width="21" height="21"/> PyPop7 has been used and/or cited in one Nature paper [Veenstra et al., Nature, 2025], among others. For any questions or help, please use Discussions directly.


PyPop7 is a Python library of population-based randomized optimization algorithms for single-objective, real-parameter, unconstrained black-box optimization (BBO) problems. Its design goal is to provide a unified interface and elegant implementations of, e.g., evolutionary algorithms, swarm optimizers, and pattern search, with three aims: (1) to facilitate research repeatability in a controllable manner, (2) to promote wide benchmarking in an open-source way, and especially (3) to support real-world BBO applications in a trial-and-error manner.

Specifically, to alleviate the notorious curse-of-dimensionality issue in BBO, its focus is to cover state-of-the-art (SOTA) algorithms for large-scale optimization under black-box scenarios, though many small- and medium-scale algorithm versions and variants are also included (for theoretical, benchmarking, educational, or practical purposes). For a growing list of its diverse usage and/or citation cases, please refer to this online website. Although we have chosen the GPL-3.0 license, anyone can use, modify, and improve it freely for any positive purpose (whether open-source or closed-source).

Quickstart

The following three steps are often enough to utilize the black-box optimization (BBO) power of PyPop7:

  • We recommend using pip to install pypop7 inside a Python 3 virtual environment created via venv or conda (though this is not mandatory):
$ pip install pypop7

To download and install a Python virtual environment with the free Miniconda, please refer to https://www.anaconda.com/docs/getting-started/miniconda/main.

For PyPop7, the number 7 was added simply because the name pypop had already been registered by others on PyPI. Its butterfly icon alludes to the great book (with butterflies on its cover) by Fisher ("(one of) the greatest of Darwin's successors"): The Genetical Theory of Natural Selection, which directly inspired Prof. Holland's proposal of Genetic Algorithms (GA).

  • Define the objective (cost / loss / error / fitness) function to be minimized for the optimization problem at hand (here the term fitness function is used, following the established terminology of evolutionary computation):
import numpy as np  # for numerical computation (as the computing engine of pypop7)

# Rosenbrock (one notorious test function from the continuous optimization community)
def rosenbrock(x):
    return 100.0 * np.sum(np.square(x[1:] - np.square(x[:-1]))) + np.sum(np.square(x[:-1] - 1.0))

# to define the fitness function to be *minimized* and also its problem settings (`dict`)
ndim_problem = 1000
problem = {'fitness_function': rosenbrock,  # fitness function corresponding to the problem
           'ndim_problem': ndim_problem,  # number of dimensions of the problem to be optimized
           'lower_boundary': -5.0 * np.ones((ndim_problem,)),  # lower boundary of search range
           'upper_boundary': 5.0 * np.ones((ndim_problem,))}  # upper boundary of search range
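As a quick sanity check (not part of the original snippet), this Rosenbrock variant attains its global minimum of 0 at x* = (1, ..., 1):

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * np.sum(np.square(x[1:] - np.square(x[:-1]))) + np.sum(np.square(x[:-1] - 1.0))

print(rosenbrock(np.ones(1000)))  # 0.0 at the global optimum x* = (1, ..., 1)
```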

Without loss of generality, only minimization is considered, since maximization can be converted to minimization simply by negating (-) the objective.
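For instance, a hypothetical maximization objective can be wrapped as follows (sphere_max and sphere_min are illustrative names, not part of PyPop7):

```python
import numpy as np

def sphere_max(x):
    # hypothetical objective to be *maximized* (optimum at x = 0)
    return -np.sum(np.square(x))

def sphere_min(x):
    # equivalent objective to be *minimized*, obtained by negation
    return -sphere_max(x)

print(sphere_min(np.zeros(5)))  # 0.0, i.e., the minimum sits at the original maximizer
```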

  • Run one black-box optimizer (or more) on the above optimization problem. Owing to its low computational complexity and good metric-learning ability, LM-MA-ES is chosen here as one example. Please refer to https://pypop.readthedocs.io/en/latest/es/lmmaes.html for its algorithmic procedure in detail.
# LMMAES: Limited Memory Matrix Adaptation Evolution Strategy
from pypop7.optimizers.es.lmmaes import LMMAES
# to define algorithm options (which differ in details among different optimizers)
options = {'fitness_threshold': 1e-10,  # to stop if best-so-far fitness <= 1e-10
           'max_runtime': 3600.0,  # to stop if runtime >= 1 hour (3600 seconds)
           'seed_rng': 0,  # random seed (which should be set for repeatability)
           'x': 4.0 * np.ones((ndim_problem,)),  # mean of search distribution
           'sigma': 3.0,  # global step-size (but not necessarily optimal)
           'verbose': 500}
lmmaes = LMMAES(problem, options)  # to initialize (under a unified API)
results = lmmaes.optimize()  # to run its (time-consuming) evolution process
print(results)
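As a rough reference point for what any sensible optimizer should beat, a pure-NumPy random-search baseline can be sketched under the same problem interface (this helper is illustrative and not part of PyPop7):

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * np.sum(np.square(x[1:] - np.square(x[:-1]))) + np.sum(np.square(x[:-1] - 1.0))

def random_search(fitness, ndim, lower, upper, max_evals=1000, seed=0):
    # uniformly sample the box [lower, upper]^ndim and keep the best-so-far point
    rng = np.random.default_rng(seed)
    best_x, best_y = None, np.inf
    for _ in range(max_evals):
        x = rng.uniform(lower, upper, size=ndim)
        y = fitness(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

best_x, best_y = random_search(rosenbrock, 10, -5.0, 5.0)
print(best_y)  # best-so-far fitness; typically far from the optimum of 0
```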

If this library has been used in your project or paper, please cite the following JMLR software paper (in the BibTeX format):

@article{2024-JMLR-Duan,
  title={{PyPop7}: A {pure-Python} library for population-based black-box optimization},
  author={Duan, Qiqi and Zhou, Guochen and Shao, Chang and Others},
  journal={Journal of Machine Learning Research},
  volume={25},
  number={296},
  pages={1--28},
  year={2024}
}

Clearly, to obtain a near-optimal rate of convergence, at least one key hyper-parameter of LMMAES (sigma) often needs to be well fine-tuned on this popular test function rosenbrock. In practice, Hyper-Parameter Optimization (HPO) is a common strategy to approximate the best setting(s) for the complex optimization problem at hand.
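This sensitivity to the step-size can be illustrated without extra dependencies via a minimal fixed-step (1+1)-ES sketch (a far simpler strategy than LMMAES; all names here are illustrative):

```python
import numpy as np

def sphere(x):
    return float(np.sum(np.square(x)))

def one_plus_one_es(fitness, x0, sigma, max_evals=2000, seed=0):
    # minimal (1+1)-ES with a *fixed* step-size sigma, used only to show
    # how the final fitness depends on the choice of sigma
    rng = np.random.default_rng(seed)
    x, y = x0.copy(), fitness(x0)
    for _ in range(max_evals):
        candidate = x + sigma * rng.standard_normal(x.shape)
        cy = fitness(candidate)
        if cy <= y:  # keep the candidate only if it is no worse
            x, y = candidate, cy
    return y

x0 = 4.0 * np.ones(10)
for sigma in (0.01, 0.3, 3.0):
    print(sigma, one_plus_one_es(sphere, x0, sigma))  # different sigma, different final fitness
```

Running this with several sigma values makes the trade-off visible: too small a step-size makes progress slow, while too large a one makes improvements rare.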

Online Documentation, Online Tutorials, and Future Extensions

Please refer to https://pypop.rtfd.io/ for online documentation and tutorials of this (we would immodestly say) well-designed Python library for Black-Box Optimization (see, e.g., online praise from others). A total of 4 extended versions of PyPop7 (as PP7) are ongoing or planned for further development:

  • For Constrained Black-Box Optimization (PyCoPop7 as PCP7),
  • For Noisy Black-Box Optimization (PyNoPop7 as PNP7),
  • Enhancement via Parallel and Distributed Optimization (PyDPop7 as PDP7),
  • Enhancement via Meta-evolution based Optimization (PyMePop7 as PMP7).

Black-Box Optimizers (BBO)

  • "[The main lesson of the development of our field in the last few decades is that efficient optimization methods can be developed only by intelligently employing the structure of particular instances of problems.](https://link.springer.
