PyNumDiff
Python methods for numerical differentiation of noisy data, including multi-objective optimization routines for automated parameter selection.
<p align="center"> <a href="https://pynumdiff.readthedocs.io/master/"> <img alt="Python for Numerical Differentiation of noisy time series data" src="https://raw.githubusercontent.com/florisvb/PyNumDiff/master/logo.png" width="300" height="200" /> </a> </p> <p align="center"> <img src='https://github.com/florisvb/pynumdiff/actions/workflows/test.yml/badge.svg'/> <a href='https://pynumdiff.readthedocs.io/master/'> <img src='https://app.readthedocs.org/projects/pynumdiff/badge/?version=master' alt='Documentation Status' /></a> <a href='https://coveralls.io/github/florisvb/PyNumDiff?branch=master'> <img src='https://coveralls.io/repos/github/florisvb/PyNumDiff/badge.svg?branch=master' alt='Coverage Status' /></a> <a href="https://badge.fury.io/py/pynumdiff"> <img src="https://badge.fury.io/py/pynumdiff.svg" alt="PyPI"></a> <!--a href="https://doi.org/10.5281/zenodo.6374098"> <img src="https://zenodo.org/badge/DOI/10.5281/zenodo.6374098.svg" alt="DOI"></a--> <a href="https://joss.theoj.org/papers/102257ee4b0142bf49bc18d7c810e9d5"> <img src="https://joss.theoj.org/papers/102257ee4b0142bf49bc18d7c810e9d5/status.svg"></a> </p>

Introduction
PyNumDiff is a Python package that implements many methods for computing numerical derivatives and smooth estimates of noisy data, which can be a critical step in developing dynamic models or designing control. There are seven different families of methods implemented in this repository:
- prefiltering followed by finite difference calculation
- iterated finite differencing
- polynomial fit methods
- basis function fit methods
- total variation regularization of a finite difference derivative
- generalized Kalman smoothing
- local approximation with linear model
All methods ultimately perform smoothing and achieve broadly similar runtime and accuracy, but some have situational advantages over others: for example, robustdiff is specialized to handle outliers; splinediff, polydiff, rtsdiff, and robustdiff can handle missing data; splinediff, polydiff, rbfdiff, rtsdiff, and robustdiff can handle irregularly spaced data; and rtsdiff can handle inputs on a wrapped domain, like angles. All methods can accept blocks of multidimensional data, differentiating all vectors along the dimension given by the axis parameter.
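As a minimal sketch of the first family (prefiltering followed by finite differencing), here is a plain numpy illustration, not pynumdiff's own implementation: a moving average applied twice stands in as a crude low-pass filter (pynumdiff provides proper filters such as Butterworth and Gaussian), followed by a central difference.

```python
import numpy as np

# Noisy samples of sin(t); the true derivative is cos(t)
dt = 0.01
t = np.arange(0, 2 * np.pi, dt)
rng = np.random.default_rng(0)
x = np.sin(t) + 0.05 * rng.standard_normal(t.size)

# Prefilter: moving average, applied twice as a cheap stand-in
# for the smoothing filters pynumdiff provides
window = 41
kernel = np.ones(window) / window
x_hat = np.convolve(np.convolve(x, kernel, mode='same'), kernel, mode='same')

# Finite difference on the smoothed signal
dxdt_hat = np.gradient(x_hat, dt)

# Away from the boundaries, the estimate tracks cos(t) well
interior = slice(2 * window, -2 * window)
err = np.abs(dxdt_hat[interior] - np.cos(t)[interior]).mean()
print(f"mean absolute interior error: {err:.3f}")
```

Differentiating the raw x directly would amplify the noise by roughly 1/dt; the prefilter is what makes the finite difference usable.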
For a full list and comparison, see section 7 of our Taxonomy Paper and explore modules in the Sphinx documentation.
All methods have hyperparameters, so we take a principled approach and propose a multi-objective optimization framework for choosing settings that minimize a loss function to balance the faithfulness and smoothness of the derivative estimate. For more details, refer to this paper.
Installing
Dependencies are listed in pyproject.toml. They include the usual suspects like numpy and scipy, but also optionally cvxpy.
The code is compatible with Python >=3.10. Install from PyPI with pip install pynumdiff, from source with pip install git+https://github.com/florisvb/PyNumDiff, or from a local download by running pip install . in the repo root. Call pip install pynumdiff[advanced] to also install the optional dependencies in the advanced list, such as CVXPY.
Usage
For more details, read our Sphinx documentation. The basic pattern of all differentiation methods is:
somethingdiff(x, dt, **kwargs)
where x is data, dt is a step size, and various keyword arguments control the behavior. Some methods support variable step size, in which case the second parameter is renamed dt_or_t and can receive either a constant step size or an array of values to denote sample locations. All major methods support multidimensional data, so look for an axis argument to control the dimension differentiated along.
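To illustrate what variable step size and the axis argument mean in practice, here is a sketch using numpy's np.gradient, which exposes analogous parameters (this is plain finite differencing, not one of pynumdiff's smoothing methods):

```python
import numpy as np

# Irregularly spaced sample times and a 2-D block of three signals
t = np.sort(np.random.default_rng(1).uniform(0, 2 * np.pi, 200))
x = np.stack([np.sin(t), np.cos(t), t**2])  # shape (3, 200)

# np.gradient accepts a coordinate array (variable step size) and an
# axis argument, mirroring pynumdiff's dt_or_t and axis parameters
dxdt = np.gradient(x, t, axis=1)

print(x.shape, dxdt.shape)  # both (3, 200)
```

Each row is differentiated independently along axis 1; passing the array t instead of a scalar handles the uneven spacing.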
You can set the hyperparameters:
from pynumdiff.submodule import method
x_hat, dxdt_hat = method(x, dt, param1=val1, param2=val2, ...)
Or you can find hyperparameter settings by calling the multi-objective optimization algorithm from the optimize module:
import numpy as np
from pynumdiff.optimize import optimize
# estimate cutoff_frequency by (a) counting the number of true peaks per second in the data or (b) inspecting the power spectrum and choosing a cutoff
tvgamma = np.exp(-1.6*np.log(cutoff_frequency) - 0.71*np.log(dt) - 5.1) # see https://ieeexplore.ieee.org/abstract/document/9241009
params, val = optimize(somethingdiff, x, dt, tvgamma=tvgamma, # smoothness hyperparameter which defaults to None if dxdt_truth given
dxdt_truth=None, # give ground truth data if available, in which case tvgamma goes unused
search_space_updates={'param1':[vals], 'param2':[vals], ...})
print('Optimal parameters: ', params)
x_hat, dxdt_hat = somethingdiff(x, dt, **params)
If no search_space_updates is given, a default search space is used. See the top of optimize.py.
The following heuristic works well for choosing tvgamma, where cutoff_frequency is the highest frequency content of the signal in your data, and dt is the timestep: tvgamma=np.exp(-1.6*np.log(cutoff_frequency)-0.71*np.log(dt)-5.1). Larger values of tvgamma produce smoother derivatives. The value of tvgamma is largely universal across methods, making it easy to compare results between methods. Be aware that the optimization is computationally intensive.
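As a worked instance of the heuristic, with hypothetical example values of cutoff_frequency = 2 Hz and dt = 0.01 s:

```python
import math

# Hypothetical values: signal content below ~2 Hz, sampled at 100 Hz
cutoff_frequency = 2.0  # Hz
dt = 0.01               # s

# tvgamma = exp(-1.6*ln(f_c) - 0.71*ln(dt) - 5.1)
tvgamma = math.exp(-1.6 * math.log(cutoff_frequency)
                   - 0.71 * math.log(dt) - 5.1)
print(f"tvgamma = {tvgamma:.4f}")  # ~0.053
```

A higher cutoff frequency (faster dynamics) yields a smaller tvgamma, i.e. less smoothing, while a coarser timestep also shifts the balance.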
Notebook examples
Much more extensive usage is demonstrated in Jupyter notebooks:
- Differentiation with different methods: 1_basic_tutorial.ipynb
- Parameter Optimization: 2_optimizing_hyperparameters.ipynb
See the README in the notebooks/ folder for a full guide to all demos and experiments.
Repo Structure
- .github/workflows contains .yaml files that configure our GitHub Actions continuous integration (CI) runs.
- docs/ contains makefiles and .rst files that govern the way sphinx builds documentation, either locally by navigating to this folder and calling make html or in the cloud by readthedocs.io.
- notebooks/ contains Jupyter notebooks that demonstrate some usage of the library.
- pynumdiff/ contains the source code. For a full list of modules and further navigation help, see the readme in this subfolder.
- .coveragerc governs coverage runs, listing files and functions/lines that should be excluded, e.g. plotting code.
- .editorconfig ensures tabs are displayed as 4 characters wide.
- .gitignore ensures files generated by local pip installs, Jupyter notebook runs, caches from code runs, virtual environments, and more are not picked up by git and accidentally added to the repo.
- .pylintrc configures pylint, a tool for autochecking code quality.
- .readthedocs.yaml configures readthedocs and is necessary for documentation to get auto-rebuilt.
- CITATION.cff is citation information for the Journal of Open-Source Software (JOSS) paper associated with this project.
- LICENSE.txt allows free usage of this project.
- README.md is the text you're reading, hello.
- pyproject.toml governs how this package is set up and installed, including dependencies.
Citation
See CITATION.cff file as well as the following references.
PyNumDiff python package:
@article{PyNumDiff2022,
doi = {10.21105/joss.04078},
url = {https://doi.org/10.21105/joss.04078},
year = {2022},
publisher = {The Open Journal},
volume = {7},
number = {71},
pages = {4078},
author = {Floris van Breugel and Yuying Liu and Bingni W. Brunton and J. Nathan Kutz},
title = {PyNumDiff: A Python package for numerical differentiation of noisy time-series data},
journal = {Journal of Open Source Software}
}
Collection of numerical differentiation methods:
@misc{komarov2025taxonomynumericaldifferentiationmethods,
title={A Taxonomy of Numerical Differentiation Methods},
author={Pavel Komarov and Floris van Breugel and J. Nathan Kutz},
year={2025},
eprint={2512.09090},
archivePrefix={arXiv},
primaryClass={math.NA},
url={https://arxiv.org/abs/2512.09090}
}
Optimization algorithm:
@article{ParamOptimizationDerivatives2020,
doi={10.1109/ACCESS.2020.3034077},
author={F. {van Breugel} and J. {Nathan Kutz} and B. W. {Brunton}},
journal={IEEE Access},
title={Numerical differentiation of noisy data: A unifying multi-objective optimization framework},
year={2020}
}
Running the tests
We use GitHub Actions for continuous integration testing.
Run tests locally by navigating to the repo in a terminal and calling
> pytest -s
Add the flag --plot to see plots of the methods against test functions. Add the flag --bounds to print $\log$ error bounds (useful when changing method behavior).
License
This project uses the MIT License. It is 100% open source; feel free to use the code however you like.