Neurodiffeq
A PyTorch-based library for solving differential equations with neural networks, used by multiple research groups around the world, including at Harvard IACS.
Citation
A second NeuroDiffEq paper has been published. Please cite both papers if you use features that became available after the 2020 version.
@article{chen2020neurodiffeq,
title={NeuroDiffEq: A Python package for solving differential equations with neural networks},
author={Chen, Feiyu and Sondak, David and Protopapas, Pavlos and Mattheakis, Marios and Liu, Shuheng and Agarwal, Devansh and Di Giovanni, Marco},
journal={Journal of Open Source Software},
volume={5},
number={46},
pages={1931},
year={2020}
}
@article{liu2025recent,
title={Recent Advances of NeuroDiffEq--An Open-Source Library for Physics-Informed Neural Networks},
author={Liu, Shuheng and Protopapas, Pavlos and Sondak, David and Chen, Feiyu},
journal={arXiv preprint arXiv:2502.12177},
year={2025}
}
🔥🔥🔥 Did you know that neurodiffeq supports solution bundles and can be used to solve inverse problems? See here!
:mortar_board: Already familiar with neurodiffeq? :point_down: Jump to FAQs.
Introduction
neurodiffeq is a package for solving differential equations with neural networks. Differential equations are equations that relate some function to its derivatives. They emerge in various scientific and engineering domains. Traditionally, these problems are solved by numerical methods (e.g., finite difference, finite element). While these methods are effective and adequate, their expressibility is limited by their function representation: they produce solutions only at discrete grid points. It would therefore be attractive to compute solutions of differential equations that are continuous and differentiable over the whole domain.
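For contrast, here is a minimal finite-difference sketch (forward Euler, one of the traditional methods mentioned above) solving u' = u with u(0) = 1 on [0, 1]. Note that it only produces values at discrete grid points; the step size and problem are illustrative choices, not part of neurodiffeq.

```python
import numpy as np

# Forward Euler for u' = u, u(0) = 1 on [0, 1].
# The result is a discrete array of values, not a continuous function.
n = 1000
h = 1.0 / n
u = np.empty(n + 1)
u[0] = 1.0
for i in range(n):
    u[i + 1] = u[i] + h * u[i]

print(abs(u[-1] - np.e) < 1e-2)  # True: u(1) is close to e, but only on the grid
```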
As universal function approximators, artificial neural networks have been shown to have the potential to solve ordinary differential equations (ODEs) and partial differential equations (PDEs) with certain initial/boundary conditions. The aim of neurodiffeq is to implement these existing techniques of using ANNs to solve differential equations in a way that allows the software to be flexible enough to work on a wide range of user-defined problems.
Installation
Using pip
Like most standard libraries, neurodiffeq is hosted on PyPI. To install the latest stable release,
pip install -U neurodiffeq # '-U' means update to latest version
Manually
Alternatively, you can install the library manually to get early access to our new features. This is the recommended way for developers who want to contribute to the library.
git clone https://github.com/NeuroDiffGym/neurodiffeq.git
cd neurodiffeq && pip install -r requirements.txt
pip install . # To make changes to the library, use `pip install -e .`
pytest tests/ # Run tests. Optional.
Getting Started
We are happy to help you with any questions. In the meantime, you can check out the FAQs.
To view complete tutorials and documentation of neurodiffeq, please check Official Documentation.
In addition to the documentation, we have a quick walkthrough Demo Video with slides.
Example Usages
Imports
import numpy as np  # used in the boundary conditions below
import torch        # used in the boundary conditions below
from neurodiffeq import diff
from neurodiffeq.solvers import Solver1D, Solver2D
from neurodiffeq.conditions import IVP, DirichletBVP2D
from neurodiffeq.networks import FCNN, SinActv
ODE System Example
Here we solve a non-linear system of two ODEs, known as the Lotka–Volterra equations. There are two unknown functions (u and v) and a single independent variable (t).
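Written out, the system and initial conditions encoded in the code below are:

```latex
\frac{du}{dt} = u - uv, \qquad \frac{dv}{dt} = uv - v,
\qquad u(0) = 1.5, \quad v(0) = 1.0
```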
def ode_system(u, v, t):
return [diff(u,t)-(u-u*v), diff(v,t)-(u*v-v)]
conditions = [IVP(t_0=0.0, u_0=1.5), IVP(t_0=0.0, u_0=1.0)]  # u(0)=1.5, v(0)=1.0; the keyword is `u_0` for every unknown function
nets = [FCNN(actv=SinActv), FCNN(actv=SinActv)]
solver = Solver1D(ode_system, conditions, t_min=0.1, t_max=12.0, nets=nets)
solver.fit(max_epochs=3000)
solution = solver.get_solution()
solution is a callable object; you can pass in NumPy arrays or torch tensors like
u, v = solution(t, to_numpy=True) # t can be np.ndarray or torch.Tensor
Plotting u and v against their analytical solutions yields something like:

PDE System Example
Here we solve a Laplace equation with Dirichlet boundary conditions on a rectangle. Note that we choose the Laplace equation because its analytical solution is simple to compute. In practice, you can attempt any nonlinear, chaotic PDE, provided you tune the solver well enough.
Solving a 2-D PDE system is quite similar to solving ODEs, except there are two variables x and y for boundary value problems or x and t for initial boundary value problems, both of which are supported.
def pde_system(u, x, y):
return [diff(u, x, order=2) + diff(u, y, order=2)]
conditions = [
DirichletBVP2D(
x_min=0, x_min_val=lambda y: torch.sin(np.pi*y),
x_max=1, x_max_val=lambda y: 0,
y_min=0, y_min_val=lambda x: 0,
y_max=1, y_max_val=lambda x: 0,
)
]
nets = [FCNN(n_input_units=2, n_output_units=1, hidden_units=(512,))]
solver = Solver2D(pde_system, conditions, xy_min=(0, 0), xy_max=(1, 1), nets=nets)
solver.fit(max_epochs=2000)
solution = solver.get_solution()
The signature of solution for a 2D PDE is slightly different from that of an ODE. Again, it takes in either numpy arrays or torch tensors.
u = solution(x, y, to_numpy=True)
Evaluating u on [0,1] × [0,1] yields the following plots
| ANN-Based Solution | Residual of PDE |
| :-------------------------------------------------: | :----------------------------------------------------------: |
| (figure omitted) | (figure omitted) |
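For reference, the returned solution can be evaluated on a mesh like this. The `solution` below is a stand-in that uses the known closed-form solution of this particular boundary-value problem, so the snippet runs without a trained network; in practice you would use the callable returned by `solver.get_solution()`.

```python
import numpy as np

# Stand-in for solver.get_solution(): the closed-form solution of the
# Laplace equation with the Dirichlet conditions above (illustrative only).
def solution(x, y, to_numpy=True):
    return np.sin(np.pi * y) * np.sinh(np.pi * (1 - x)) / np.sinh(np.pi)

xs, ys = np.meshgrid(np.linspace(0, 1, 101), np.linspace(0, 1, 101))
u = solution(xs.reshape(-1), ys.reshape(-1), to_numpy=True).reshape(xs.shape)

# Sanity check: the left-edge boundary condition u(0, y) = sin(pi*y) holds.
print(np.allclose(u[:, 0], np.sin(np.pi * ys[:, 0])))  # True
```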
Using a Monitor
A monitor is a tool for visualizing PDE/ODE solutions as well as the history of loss and custom metrics during training. Jupyter Notebook users need to run the `%matplotlib notebook` magic. For Jupyter Lab users, try `%matplotlib widget`.
from neurodiffeq.monitors import Monitor1D
...
monitor = Monitor1D(t_min=0.0, t_max=12.0, check_every=100)
solver.fit(..., callbacks=[monitor.to_callback()])
You should see the plots update every 100 epochs as well as on the last epoch, showing two plots — one for solution visualization on the interval [0,12] and the other for loss history (training and validation).

Custom Networks
For convenience, we have implemented FCNN, a fully connected neural network whose hidden units and activation functions can be customized.
from neurodiffeq.networks import FCNN
# Default: n_input_units=1, n_output_units=1, hidden_units=(32, 32), actv=torch.nn.Tanh
net1 = FCNN(n_input_units=..., n_output_units=..., hidden_units=[..., ..., ...], actv=...)
...
nets = [net1, net2, ...]
FCNN is usually a good starting point. For advanced users, solvers are compatible with any custom `torch.nn.Module`. The only constraints are:

- The module takes in a tensor of shape `(None, n_coords)` and outputs a tensor of shape `(None, 1)`.
- There must be a total of `n_funcs` modules in `nets` to be passed to `solver = Solver(..., nets=nets)`.

Actually, neurodiffeq has a `single_net` feature that doesn't obey the above rules, which won't be covered here.
Read the PyTorch tutorial on building your own network (a.k.a module) architecture.
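As a sketch, here is a hypothetical custom module satisfying both constraints above. The class name and architecture are illustrative, not part of the neurodiffeq API; any `torch.nn.Module` mapping `(None, n_coords)` to `(None, 1)` would do.

```python
import torch
import torch.nn as nn

class ResBlockNet(nn.Module):
    """Illustrative custom architecture: maps (batch, n_coords) -> (batch, 1)."""
    def __init__(self, n_coords=2, hidden=64):
        super().__init__()
        self.inp = nn.Linear(n_coords, hidden)
        self.hid = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, coords):           # coords: (batch, n_coords)
        h = torch.tanh(self.inp(coords))
        h = h + torch.tanh(self.hid(h))  # simple residual connection
        return self.out(h)               # (batch, 1)

net = ResBlockNet(n_coords=2)
print(net(torch.rand(8, 2)).shape)  # torch.Size([8, 1])
```

For a 2-D problem with one unknown function, `nets = [ResBlockNet(n_coords=2)]` would then be passed to the solver.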
Transfer Learning
Transfer learning is easily done by serializing old_solver.nets (a list of torch modules) to disk and then loading them and passing to a new solver:
old_solver.fit(max_epochs=...)
# ... dump `old_solver.nets` to disk
# ... load the networks from disk, store them in some `loaded_nets` variable
new_solver = Solver(..., nets=loaded_nets)
new_solver.fit(max_epochs=...)
We are currently working on wrapper functions to save/load networks and other internal variables of solvers. In the meantime, you can read the PyTorch tutorial on saving and loading your networks.
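Until such wrappers land, the dump/load steps can be sketched in plain PyTorch. The file name and network architecture below are assumptions for illustration; the key idea is to serialize the state dicts of `old_solver.nets` and restore them into freshly constructed networks of the same architecture.

```python
import torch
import torch.nn as nn

# Stand-in for old_solver.nets: a list of trained torch modules.
old_nets = [nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))]

# Serialize the state dicts (not the solver itself) to disk ...
torch.save([net.state_dict() for net in old_nets], "nets.pt")

# ... and later restore them into networks with the same architecture.
loaded_nets = [nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))]
for net, state in zip(loaded_nets, torch.load("nets.pt")):
    net.load_state_dict(state)

# loaded_nets can now be passed to a new solver, e.g. Solver1D(..., nets=loaded_nets)
```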
Sampling Strategies
In neurodiffeq, the networks are trained by minimizing the loss (ODE/PDE residuals) evaluated on a set of points sampled from the domain.