dit is a Python package for information theory.

|build| |codecov| |docs| |conda|

|joss| |zenodo| |slack|

Introduction
------------

Information theory is a powerful extension to probability and statistics, quantifying dependencies among arbitrary random variables in a way that is consistent and comparable across systems and scales. Information theory was originally developed to quantify how quickly and reliably information could be transmitted across an arbitrary channel. The demands of modern, data-driven science have been co-opting and extending these quantities and methods into novel, multivariate settings where interpretations and best practices are not yet established. For example, there are at least four reasonable multivariate generalizations of the mutual information, none of which inherits all the interpretations of the standard bivariate case; which is best to use is context-dependent. ``dit`` implements a vast range of multivariate information measures so that information practitioners can study how these measures behave and interact in a variety of contexts. We hope that having all these measures and techniques implemented in one place will support the development of robust techniques for automatically quantifying dependencies within a system and concretely interpreting what those dependencies mean.
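To make that disagreement concrete, here is a minimal sketch comparing three of those generalizations on the three-variable ``xor`` distribution (function names as they appear in ``dit.multivariate``; the values follow from each marginal having one bit of entropy and the joint having two):

.. code:: python

   >>> import dit, dit.example_dists
   >>> d = dit.example_dists.Xor()
   >>> dit.multivariate.coinformation(d)           # I[X:Y:Z]
   -1.0
   >>> dit.multivariate.total_correlation(d)       # T[X:Y:Z]
   1.0
   >>> dit.multivariate.dual_total_correlation(d)  # B[X:Y:Z]
   2.0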

Citing
------

If you use dit in your research, please cite it as::

  @article{dit,
      Author = {James, R. G. and Ellison, C. J. and Crutchfield, J. P.},
      Title = {{dit}: a {P}ython package for discrete information theory},
      Journal = {The Journal of Open Source Software},
      Volume = {3},
      Number = {25},
      Pages = {738},
      Year = {2018},
      Doi = {https://doi.org/10.21105/joss.00738}
  }

Basic Information
-----------------

Documentation
*************

http://docs.dit.io

Downloads
*********

https://pypi.org/project/dit/

https://anaconda.org/conda-forge/dit

Dependencies
************

* Python 3.9+
* `boltons <https://boltons.readthedocs.io>`_
* `debtcollector <https://docs.openstack.org/debtcollector/>`_
* `lattices <https://github.com/dit/lattices>`_
* `loguru <https://loguru.readthedocs.io>`_
* `networkx <https://networkx.github.io/>`_
* `numpy <http://www.numpy.org/>`_
* `PLTable <https://github.com/platomav/PLTable>`_
* `scipy <https://www.scipy.org/>`_

Optional Dependencies
*********************

* colorama: colored column heads in PID indicating failure modes
* cython: faster sampling from distributions
* hypothesis: random sampling of distributions
* jax, jaxlib: JAX-based optimization backend with autodiff support
* matplotlib, python-ternary: plotting of various information-theoretic expansions
* numdifftools: numerical evaluation of gradients and hessians during optimization
* pint: add units to informational values
* scikit-learn: faster nearest-neighbor lookups during entropy/mutual information estimation from samples
* torch: PyTorch-based optimization backend with autodiff and GPU support
* xarray: ``Distribution`` class for labeled, algebra-friendly distributions
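None of these is required. If you want a particular feature, install the corresponding packages alongside ``dit``; for example, to enable plotting (a sketch, using the package names listed above):

.. code-block:: bash

  pip install dit matplotlib python-ternary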

Install
*******

The easiest way to install is:

.. code-block:: bash

  pip install dit

If you want to install ``dit`` within a conda environment, you can simply do:

.. code-block:: bash

  conda install -c conda-forge dit

For development, we recommend `uv <https://docs.astral.sh/uv/>`_:

.. code-block:: bash

  git clone https://github.com/dit/dit.git
  cd dit
  uv sync --extra dev

This installs ``dit`` in editable mode with all development dependencies
(tests, docs, linting, type checking, and optional backends).
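A quick smoke test of the editable install (this sketch assumes ``dit`` exposes ``__version__``, as is conventional):

.. code-block:: bash

  uv run python -c "import dit; print(dit.__version__)"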

Testing
*******

.. code-block:: bash

  # Using uv (recommended)
  uv run pytest

  # Or with pip
  pip install -e ".[test]"
  pytest

Code and bug tracker
********************

https://github.com/dit/dit

License
*******

BSD 3-Clause, see LICENSE.txt for details.

Implemented Measures
--------------------

``dit`` implements the following information measures. Most of these are implemented in multivariate & conditional
generality, where such generalizations either exist in the literature or are relatively obvious --- for example,
though it is not in the literature, the multivariate conditional exact common information is implemented here
(a brief usage sketch follows the table).

+------------------------------------------+-----------------------------------------+-----------------------------------+
| Entropies                                | Mutual Informations                     | Divergences                       |
|                                          |                                         |                                   |
| * Shannon Entropy                        | * Co-Information                        | * Variational Distance            |
| * Renyi Entropy                          | * Interaction Information               | * Kullback-Leibler Divergence \   |
| * Tsallis Entropy                        | * Total Correlation /                   |   Relative Entropy                |
| * Necessary Conditional Entropy          |   Multi-Information                     | * Cross Entropy                   |
| * Residual Entropy /                     | * Dual Total Correlation /              | * Jensen-Shannon Divergence       |
|   Independent Information /              |   Binding Information                   | * Earth Mover's Distance          |
|   Variation of Information               | * CAEKL Multivariate Mutual Information +-----------------------------------+
+------------------------------------------+-----------------------------------------+ Other Measures                    |
| Common Informations                      | Partial Information Decomposition       |                                   |
|                                          |                                         | * Channel Capacity                |
| * Gacs-Korner Common Information         | * :math:`I_{min}`                       | * Complexity Profile              |
| * Wyner Common Information               | * :math:`I_{\wedge}`                    | * Connected Informations          |
| * Exact Common Information               | * :math:`I_{RR}`                        | * Copy Mutual Information         |
| * Functional Common Information          | * :math:`I_{\downarrow}`                | * Cumulative Residual Entropy     |
| * MSS Common Information                 | * :math:`I_{proj}`                      | * Extropy                         |
+------------------------------------------+ * :math:`I_{BROJA}`                     | * Hypercontractivity Coefficient  |
| Secret Key Agreement Bounds              | * :math:`I_{ccs}`                       | * Information Bottleneck          |
|                                          | * :math:`I_{\pm}`                       | * Information Diagrams            |
| * Secrecy Capacity                       | * :math:`I_{sx}`                        | * Information Trimming            |
| * Intrinsic Mutual Information           | * :math:`I_{dep}`                       | * Lautum Information              |
| * Reduced Intrinsic Mutual Information   | * :math:`I_{RAV}`                       | * LMPR Complexity                 |
| * Minimal Intrinsic Mutual Information   | * :math:`I_{mmi}`                       | * Marginal Utility of Information |
| * Necessary Intrinsic Mutual Information | * :math:`I_{\prec}`                     | * Maximum Correlation             |
| * Two-Part Intrinsic Mutual Information  | * :math:`I_{RA}`                        | * Maximum Entropy Distributions   |
|                                          | * :math:`I_{SKAR}`                      | * Perplexity                      |
|                                          | * :math:`I_{IG}`                        | * Rate-Distortion Theory          |
|                                          | * :math:`I_{RDR}`                       | * TSE Complexity                  |
+------------------------------------------+-----------------------------------------+-----------------------------------+
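As a sketch of the multivariate & conditional generality mentioned above, the conditional exact common information of the ``xor`` distribution can be computed by passing ``rvs`` and ``crvs`` index lists, as the ``dit.multivariate`` functions accept. Given ``Z``, the inputs ``X`` and ``Y`` are perfectly correlated bits, so we would expect a value of 1.0 (up to optimizer precision):

.. code:: python

   >>> import dit, dit.example_dists
   >>> d = dit.example_dists.Xor()
   >>> # G[X : Y | Z]: exact common information of X and Y, conditioned on Z
   >>> dit.multivariate.exact_common_information(d, rvs=[[0], [1]], crvs=[2])
   1.0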

Quickstart
----------

The basic usage of ``dit`` corresponds to creating distributions, modifying them
if need be, and then computing properties of those distributions. First, we
import:

.. code:: python

   >>> import dit

Suppose we have a really thick coin, one so thick that there is a reasonable
chance of it landing on its edge. Here is how we might represent the coin in
``dit``.

.. code:: python

   >>> d = dit.Distribution(['H', 'T', 'E'], [.4, .4, .2])
   >>> print(d)
   Class:          Distribution
   Alphabet:       ('E', 'H', 'T') for all rvs
   Base:           linear
   Outcome Class:  str
   Outcome Length: 1
   RV Names:       None

   x   p(x)
   E   0.2
   H   0.4
   T   0.4

Calculate the probability of ``H`` and also of the combination ``H or T``.

.. code:: python

   >>> d['H']
   0.4
   >>> d.event_probability(['H','T'])
   0.8

Calculate the Shannon entropy and extropy of the joint distribution.

.. code:: python

   >>> dit.shannon.entropy(d)
   1.5219280948873621
   >>> dit.other.extropy(d)
   1.1419011889093373

Create a distribution where ``Z = xor(X, Y)``.

.. code:: python

   >>> import dit.example_dists
   >>> d = dit.example_dists.Xor()
   >>> d.set_rv_names(['X', 'Y', 'Z'])
   >>> print(d)
   Class:          Distribution
   Alphabet:       ('0', '1') for all rvs
   Base:           linear
   Outcome Class:  str
   Outcome Length: 3
   RV Names:       ('X', 'Y', 'Z')

   x     p(x)
   000   1/4
   011   1/4
   101   1/4
   110   1/4
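From here one might, for instance, verify that the output is fully determined by the inputs while carrying one bit of information about them (a sketch using functions from ``dit.shannon``):

.. code:: python

   >>> dit.shannon.conditional_entropy(d, ['Z'], ['X', 'Y'])
   0.0
   >>> dit.shannon.mutual_information(d, ['X', 'Y'], ['Z'])
   1.0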