
.. image:: docs/img/logo-ParMOO.svg
   :align: center
   :alt: ParMOO

|

.. image:: https://img.shields.io/badge/License-BSD_3--Clause-green.svg
   :target: https://opensource.org/licenses/BSD-3-Clause
   :alt: License

.. image:: https://img.shields.io/pypi/v/parmoo.svg?color=green
   :target: https://pypi.org/project/parmoo

.. image:: https://github.com/parmoo/parmoo/actions/workflows/parmoo-ci.yml/badge.svg?branch=main
   :target: https://github.com/parmoo/parmoo/actions

.. image:: https://readthedocs.org/projects/parmoo/badge/?maxAge=2592000
   :target: https://parmoo.readthedocs.org/en/latest
   :alt: Documentation Status

.. image:: https://joss.theoj.org/papers/10.21105/joss.04468/status.svg
   :target: https://doi.org/10.21105/joss.04468
   :alt: JOSS DOI

.. image:: https://coveralls.io/repos/github/parmoo/parmoo/badge.svg?branch=main
   :target: https://coveralls.io/github/parmoo/parmoo?branch=main

|

ParMOO: Python library for parallel multiobjective simulation optimization
==========================================================================

ParMOO is a parallel multiobjective optimization solver that seeks to exploit simulation-based structure in objective and constraint functions.

To exploit structure, ParMOO models simulations separately from objectives and constraints. In our language:

  • a design variable is an input to the problem, which we can directly control;
  • a simulation is an expensive or time-consuming process, including real-world experimentation, which is treated as a blackbox function of the design variables and evaluated sparingly;
  • an objective is an algebraic function of the design variables and/or simulation outputs, which we would like to optimize; and
  • a constraint is an algebraic function of the design variables and/or simulation outputs, which cannot exceed a specified bound.

.. figure:: docs/img/des-sim-obj-space.png
   :alt: Designs, simulations, and objectives
   :align: center

|

To solve a multiobjective optimization problem (MOOP), we use surrogate models of the simulation outputs, together with the algebraic definition of the objectives and constraints.
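To make the surrogate idea concrete, here is a minimal Gaussian RBF interpolant fit to a few samples of a toy simulation, written in plain numpy; the helper names are illustrative only and are not part of the ParMOO API.

.. code-block:: python

   import numpy as np

   def fit_gauss_rbf(x_train, y_train, eps=1.0):
       # Gaussian kernel matrix over pairwise training-point distances
       d = np.linalg.norm(x_train[:, None, :] - x_train[None, :, :], axis=-1)
       return np.linalg.solve(np.exp(-(eps * d) ** 2), y_train)

   def predict_gauss_rbf(x_new, x_train, weights, eps=1.0):
       # Evaluate the fitted surrogate at new design points
       d = np.linalg.norm(x_new[:, None, :] - x_train[None, :, :], axis=-1)
       return np.exp(-(eps * d) ** 2) @ weights

   # Toy "simulation" output f(x) = x**2, sampled at 5 design points
   x_train = np.array([[0.0], [0.25], [0.5], [0.75], [1.0]])
   y_train = np.sum(x_train ** 2, axis=1)
   w = fit_gauss_rbf(x_train, y_train)
   y_hat = predict_gauss_rbf(np.array([[0.5]]), x_train, w)

Because the RBF surrogate interpolates, y_hat reproduces the sampled value at x = 0.5 (namely 0.25) up to numerical error, while remaining cheap to evaluate anywhere in the design space.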

ParMOO is implemented in Python. In order to achieve scalable parallelism, we use libEnsemble_ to distribute batches of simulation evaluations across parallel resources.
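ParMOO itself delegates this to libEnsemble_, but the batch-parallel pattern can be sketched with the Python standard library alone; expensive_sim below is a stand-in for a real simulation, not part of ParMOO.

.. code-block:: python

   import math
   from concurrent.futures import ThreadPoolExecutor

   def expensive_sim(x):
       # Stand-in for an expensive blackbox simulation
       return math.sin(x) ** 2

   def evaluate_batch(batch):
       # Farm one batch of design points out to parallel workers
       with ThreadPoolExecutor(max_workers=4) as pool:
           return list(pool.map(expensive_sim, batch))

   results = evaluate_batch([0.0, 0.5, 1.0, 1.5])

In ParMOO's actual workflow, each batch of candidate designs produced in an iteration is dispatched to workers in this spirit, with libEnsemble managing the parallel resources.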

Dependencies
------------

ParMOO has been tested on Unix/Linux and MacOS systems.

ParMOO's base has the following dependencies:

  • Python_ 3.9+
  • jax_ -- for algorithmic differentiation and just-in-time (jit) compilation
  • numpy_ -- for data structures and performant numerical linear algebra
  • scipy_ -- for scientific calculations needed for specific modules
  • pandas_ -- for exporting the resulting databases

Additional dependencies are needed to use the additional features in parmoo.extras:

  • libEnsemble_ -- for managing parallel simulation evaluations

And for using the Pareto front visualization library in parmoo.viz:

  • plotly_ -- for generating interactive plots
  • dash_ -- for hosting interactive plots in your browser
  • kaleido_ -- for exporting static plots post-interaction

Installation
------------

The easiest way to install ParMOO is via the Python package index, PyPI (commonly called pip):

.. code-block:: bash

   pip install < --user > parmoo

where the angle brackets around < --user > indicate that the --user flag is optional.

To install all dependencies (including libEnsemble) use:

.. code-block:: bash

   pip install < --user > "parmoo[extras]"

Note that the full feature set for libEnsemble_ and kaleido_ may require you to separately install an MPI implementation (such as Open_MPI_) and Google Chrome (e.g., via kaleido_get_chrome_), respectively.

You can also clone this project from our GitHub_ and pip install it in-place, so that you can easily pull the latest version or checkout the develop branch for pre-release features. On Debian-based systems with a bash shell, this looks like:

.. code-block:: bash

   git clone https://github.com/parmoo/parmoo
   cd parmoo
   pip install -e .

Alternatively, the latest release of ParMOO (including all required and optional dependencies) can be installed from the conda-forge channel using:

.. code-block:: bash

   conda install --channel=conda-forge parmoo

Before doing so, it is recommended to create a new conda environment using:

.. code-block:: bash

   conda create --name channel-name
   conda activate channel-name

Testing
-------

Note that in order to run the unit tests, you must first install parmoo[extras], as described above. This may include additional steps such as kaleido_get_chrome.

You can install pytest_ with the pytest-cov_ plugin and flake8_ using the tests extension. You can then lint the project with flake8, run the unit tests for your installation with pytest, and afterward view the coverage report with the coverage report command.

.. code-block:: bash

   pip install -e ".[tests]"
   flake8 parmoo
   pytest
   coverage report

Running the regression tests and libEnsemble_ tests is a bit more involved and is usually accomplished via the -l flag for the parmoo/tests/run-tests.sh script.

To run all the linter, unit tests, regression tests, and generate the coverage report in a single command, run the script with all 4 flags set.

.. code-block:: bash

   ./parmoo/tests/run-tests.sh -curl

These tests are run regularly using GitHub Actions_.

Basic Usage
-----------

ParMOO uses numpy_ and jax_ in an object-oriented design, based around the MOOP class.

Before getting started, note that jax_ runs in single (32-bit) precision by default. To run in double precision, the following code is needed at startup:

.. code-block:: python

   import jax
   jax.config.update("jax_enable_x64", True)

This will be done automatically when importing certain modules in ParMOO, which are only compatible with double precision. However, in many use cases, 32-bit precision may be enough and provides substantial speedup in iteration tasks.
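The practical difference between the two precisions is easy to see; the snippet below uses numpy's equivalent dtypes purely for illustration.

.. code-block:: python

   import numpy as np

   # A tiny increment survives in 64-bit precision but is rounded away in
   # 32-bit, since 1e-8 is below float32 machine epsilon (about 1.2e-7)
   x32 = np.float32(1.0) + np.float32(1e-8)
   x64 = np.float64(1.0) + np.float64(1e-8)

   assert x32 == np.float32(1.0)
   assert x64 > 1.0

Whether increments at this scale matter for your design variables and objectives is a good rule of thumb for choosing between the two settings.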

Once the precision is set, to get started, create a MOOP object.

.. code-block:: python

   from parmoo import MOOP
   from parmoo.optimizers import LocalGPS

   my_moop = MOOP(LocalGPS)

To summarize the framework, in each iteration ParMOO models each simulation using a computationally cheap surrogate, then solves one or more scalarizations of the objectives, which are specified by acquisition functions. Read more about this framework at our ReadTheDocs_ page. In the above example, LocalGPS is the class of optimizers that my_moop will use to solve the scalarized surrogate problems.

Next, add design variables to the problem as follows using the MOOP.addDesign(*args) method. In this example, we define one continuous and one categorical design variable. Other options include integer, custom, and raw (using raw variables is not recommended except for expert users).

.. code-block:: python

   # Add a single continuous design variable in the range [0.0, 1.0]
   my_moop.addDesign({'name': "x1",              # optional, name
                      'des_type': "continuous",  # optional, type of variable
                      'lb': 0.0,                 # required, lower bound
                      'ub': 1.0,                 # required, upper bound
                      'tol': 1.0e-8              # optional tolerance
                      })

   # Add a second categorical design variable with 2 levels
   my_moop.addDesign({'name': "x2",               # optional, name
                      'des_type': "categorical",  # required, type of variable
                      'levels': ["good", "bad"]   # required, category names
                      })

Next, add simulations to the problem as follows using the MOOP.addSimulation method. In this example, we define a toy simulation sim_func(x).

.. code-block:: python

   import numpy as np
   from parmoo.searches import LatinHypercube
   from parmoo.surrogates import GaussRBF

   # Define a toy simulation for the problem, whose outputs are quadratic
   def sim_func(x):
       if x["x2"] == "good":
           return np.array([(x["x1"] - 0.2) ** 2, (x["x1"] - 0.8) ** 2])
       else:
           return np.array([99.9, 99.9])

   # Add the simulation to the problem
   my_moop.addSimulation({'name': "MySim",          # Optional name for this simulation
                          'm': 2,                   # This simulation has 2 outputs
                          'sim_func': sim_func,     # Our sample sim from above
                          'search': LatinHypercube, # Use a LHS search
                          'surrogate': GaussRBF,    # Use a Gaussian RBF surrogate
                          'hyperparams': {},        # Hyperparams passed to internals
                          'sim_db': {               # Optional dict of precomputed points
                              'search_budget': 10   # Set search budget
                          },
                          })

Now we can add objectives and constraints using MOOP.addObjective(*args) and MOOP.addConstraint(*args). In this example, there are 2 objectives (each corresponding to a single simulation output) and one constraint.

.. code-block:: python

   # First objective just returns the first simulation output
   def f1(x, s):
       return s["MySim"][0]

   my_moop.addObjective({'name': "f1", 'obj_func': f1})

   # Second objective just returns the second simulation output
   def f2(x, s):
       return s["MySim"][1]

   my_moop.addObjective({'name': "f2", 'obj_func': f2})

   # Add a single constraint, that x["x1"] >= 0.1
   def c1(x, s):
       return 0.1 - x["x1"]

   my_moop.addConstraint({'name': "c1", 'constraint': c1})

Finally, we must add one or more acquisition functions using MOOP.addAcquisition(*args). These are used to scalarize the surrogate problems. The number of acquisition functions typically determines the number of simulation evaluations per batch.
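To illustrate why the batch size follows the number of acquisition functions, the plain-numpy sketch below draws one random weight vector per acquisition, scalarizes two surrogate objectives by a weighted sum, and contributes one candidate design per acquisition to the batch; all names here are illustrative, not ParMOO API.

.. code-block:: python

   import numpy as np

   rng = np.random.default_rng(0)

   def surrogate_objectives(x):
       # Cheap stand-in for ParMOO's internal surrogate models of 2 objectives
       return np.array([(x - 0.2) ** 2, (x - 0.8) ** 2])

   def scalarize(x, weights):
       # Weighted-sum scalarization, one simple style of acquisition
       return weights @ surrogate_objectives(x)

   n_acquisitions = 4  # one candidate per acquisition => a batch of 4
   candidates = np.linspace(0.0, 1.0, 101)
   batch = []
   for _ in range(n_acquisitions):
       w = rng.dirichlet([1.0, 1.0])  # random convex combination of objectives
       batch.append(min(candidates, key=lambda x: scalarize(x, w)))

Each of the 4 scalarized problems picks its minimizer between the two individual optima at 0.2 and 0.8, so the resulting batch spreads the expensive simulation evaluations across the trade-off between the objectives.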
