
Mici

Manifold Markov chain Monte Carlo methods in Python


<div style="text-align: center;" align="center"> <picture> <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/matt-graham/mici/main/images/mici-logo-rectangular-light-text.svg"> <source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/matt-graham/mici/main/images/mici-logo-rectangular.svg"> <img alt="Mici logo" src="https://raw.githubusercontent.com/matt-graham/mici/main/images/mici-logo-rectangular.svg" width="400px"> </picture> <div style="text-align: center;" align="center">


</div> </div>

Mici is a Python package providing implementations of Markov chain Monte Carlo (MCMC) methods for approximate inference in probabilistic models, with a particular focus on MCMC methods based on simulating Hamiltonian dynamics on a manifold.

Features

Key features include

  • a modular design allowing use of a wide range of inference algorithms by mixing and matching different components, and making it easy to extend the package,
  • a pure Python code base with minimal dependencies, allowing easy integration within other code,
  • built-in support for several automatic differentiation frameworks, including JAX and Autograd, or the option to supply your own derivative functions,
  • implementations of MCMC methods for sampling from distributions on embedded manifolds implicitly-defined by a constraint equation and distributions on Riemannian manifolds with a user-specified metric,
  • computationally efficient inference via transparent caching of the results of expensive operations and intermediate results calculated in derivative computations allowing later reuse without recalculation,
  • memory efficient inference for large models by memory-mapping chains to disk, allowing long runs on large models without hitting memory issues.

Installation

To install and use Mici, the minimal requirements are a Python 3.10+ environment with NumPy and SciPy installed. The latest Mici release on PyPI (and its dependencies) can be installed in the current Python environment by running

pip install mici

To instead install the latest development version from the main branch on GitHub, run

pip install git+https://github.com/matt-graham/mici

If available in the installed Python environment, the following additional packages provide extra functionality and features:

  • ArviZ: if ArviZ is available the traces (dictionary) output of a sampling run can be directly converted to an arviz.InferenceData container object using arviz.convert_to_inference_data, or implicitly converted by passing the traces dictionary as the data argument to ArviZ API functions, allowing straightforward use of ArviZ's extensive visualisation and diagnostic functions.
  • Autograd: if available Autograd will be used to automatically compute the required derivatives of the model functions (providing they are specified using functions from the autograd.numpy and autograd.scipy interfaces). To sample chains in parallel using Autograd functions you also need to install multiprocess. This will cause multiprocess.Pool to be used in preference to the built-in multiprocessing.Pool for parallelisation, as multiprocess supports serialisation (via dill) of a much wider range of types, including Autograd-generated functions. Both Autograd and multiprocess can be installed alongside Mici by running pip install mici[autograd].
  • JAX: if available JAX will be used to automatically compute the required derivatives of the model functions (providing they are specified using functions from the jax interface). To sample chains in parallel using JAX functions you also need to install multiprocess, though note that JAX's use of multithreading is incompatible with forking child processes and can result in deadlocks. Both JAX and multiprocess can be installed alongside Mici by running pip install mici[jax]. Alternatively, if using a free-threaded build of Python such as Python 3.13t, thread-based parallelism can be used instead, which both avoids issues with using pickle to serialize JAX objects and avoids deadlocks when forking processes.
  • SymNum: if available SymNum will be used to automatically compute the required derivatives of the model functions (providing they are specified using functions from the symnum.numpy interface). SymNum can be installed alongside Mici by running pip install mici[symnum].
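As a minimal sketch of the ArviZ integration described above: the snippet below builds a stand-in traces dictionary of the form a Mici sampling run produces (one list of per-chain arrays per traced variable; the variable name "pos" and the shapes are illustrative assumptions, not Mici output) and shows where arviz.convert_to_inference_data would be called. The ArviZ call is kept inside a function so the rest runs without ArviZ installed.

```python
import numpy as np

# Stand-in traces dictionary: one entry per traced variable, each a list of
# per-chain arrays of shape (n_iter, ...). Names and shapes are illustrative.
traces = {
    "pos": [np.random.default_rng(i).standard_normal((100, 2)) for i in range(4)]
}

def summarise_with_arviz(traces):
    """Convert Mici-style traces to ArviZ and compute diagnostics.

    Requires ArviZ; the import is local so this module loads without it.
    """
    import arviz

    inference_data = arviz.convert_to_inference_data(traces)
    return arviz.summary(inference_data)

# Stacking the per-chain arrays gives the (chain, draw, ...) layout that
# ArviZ-compatible containers use.
stacked = {name: np.stack(chains) for name, chains in traces.items()}
print(stacked["pos"].shape)  # (4, 100, 2)
```

ArviZ treats the leading two dimensions of each array as (chain, draw), which is why a list of per-chain arrays converts cleanly.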

Why Mici?

Mici is named for Augusta 'Mici' Teller, who along with Arianna Rosenbluth developed the code for the MANIAC I computer used in the seminal paper Equations of State Calculations by Fast Computing Machines which introduced the first example of a Markov chain Monte Carlo method.

Related projects

Other Python packages for performing MCMC inference include PyMC, PyStan (the Python interface to Stan), Pyro / NumPyro, TensorFlow Probability, emcee, Sampyl and BlackJAX.

Unlike PyMC, PyStan, (Num)Pyro and TensorFlow Probability, which are complete probabilistic programming frameworks including functionality for defining a probabilistic model / program, Mici is, like emcee, Sampyl and BlackJAX, focused solely on providing implementations of inference algorithms: the user is expected to define, at a minimum, a function evaluating the negative log (unnormalized) density of the distribution of interest.

Further, while PyStan, (Num)Pyro and TensorFlow Probability all push the sampling loop into external compiled non-Python code, in Mici the sampling loop is run directly within Python. As a consequence, for small models in which the negative log density of the target distribution and other model functions are cheap to evaluate, the interpreter overhead in iterating over the chains in Python can dominate the computational cost, making sampling much slower than in packages which outsource the sampling loop to an efficient compiled implementation.

Overview of package

API documentation for the package is available here. Mici provides both a high-level functional interface and a more customizable but verbose object-oriented interface. The functions mici.sample_hmc_chains and mici.sample_constrained_hmc_chains in the high-level interface allow straightforward sampling from distributions on unconstrained and constrained spaces respectively. These functions default to an adaptive HMC sampler that dynamically sets trajectory lengths, and should work well for many problems.
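A minimal sketch of the high-level interface for a standard bivariate Gaussian target is below. The keyword names passed to mici.sample_hmc_chains (n_warm_up_iter, n_main_iter, init_states, neg_log_dens) are assumptions based on the documented interface and should be checked against the installed Mici version; the call is wrapped in a function that is defined but not invoked here, so only the target density itself is exercised.

```python
import numpy as np

def neg_log_dens(pos):
    """Negative log density (up to a constant) of a standard bivariate Gaussian."""
    return 0.5 * np.sum(pos**2)

def run_high_level_example():
    """Illustrative use of the high-level interface; requires Mici installed.

    Keyword names follow the online docs and may differ between versions.
    """
    import mici

    return mici.sample_hmc_chains(
        n_warm_up_iter=500,
        n_main_iter=1000,
        init_states=[np.zeros(2), np.ones(2)],  # one initial state per chain
        neg_log_dens=neg_log_dens,
    )

# The target density can be checked without Mici installed:
print(neg_log_dens(np.zeros(2)))  # 0.0
print(neg_log_dens(np.array([1.0, 1.0])))  # 1.0
```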

Alternatively users can explicitly create instances of the underlying classes used to implement MCMC sampling schemes in Mici to allow greater control over their behaviour. The three main user-facing modules within the mici package are the systems, integrators and samplers modules and you will generally need to create an instance of one class from each module when using the object-oriented interface.

mici.systems - Hamiltonian systems encapsulating model functions and their derivatives

  • EuclideanMetricSystem - systems with a metric on the position space with a constant matrix representation,
  • GaussianEuclideanMetricSystem - systems in which the target distribution is defined by a density with respect to the standard Gaussian measure on the position space, allowing analytically solving for the flow corresponding to the quadratic components of the Hamiltonian,
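The object-oriented pattern described above, one class each from the systems, integrators and samplers modules, can be sketched as follows for a standard Gaussian target with a manually supplied gradient (the option mentioned under Features for users not relying on Autograd / JAX / SymNum). The class and keyword names follow the online docs but are not verified against any particular release, so the Mici-dependent wiring is kept in a function that is defined but not executed; only the model functions themselves are run, with the hand-written gradient checked numerically.

```python
import numpy as np

def neg_log_dens(pos):
    # Standard bivariate Gaussian target: negative log density up to a constant.
    return 0.5 * np.sum(pos**2)

def grad_neg_log_dens(pos):
    # Manually supplied derivative (alternative to automatic differentiation).
    return pos

def build_sampler():
    """Wire together one class from each of systems, integrators and samplers.

    Requires Mici; class and keyword names follow the online docs and should
    be checked against the installed version.
    """
    import mici

    system = mici.systems.EuclideanMetricSystem(
        neg_log_dens=neg_log_dens, grad_neg_log_dens=grad_neg_log_dens
    )
    integrator = mici.integrators.LeapfrogIntegrator(system)
    rng = np.random.default_rng(1234)
    return mici.samplers.DynamicMultinomialHMC(system, integrator, rng)

# Sanity-check the hand-written gradient against central finite differences.
pos = np.array([0.5, -1.5])
eps = 1e-6
num_grad = np.array(
    [
        (neg_log_dens(pos + eps * e) - neg_log_dens(pos - eps * e)) / (2 * eps)
        for e in np.eye(2)
    ]
)
print(np.allclose(num_grad, grad_neg_log_dens(pos), atol=1e-6))  # True
```

Checking a supplied gradient this way is cheap insurance: an inconsistent neg_log_dens / grad_neg_log_dens pair silently breaks detailed balance rather than raising an error.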