ConvolutionalNeuralOperator
Convolutional Neural Operators for robust and accurate learning of PDEs
- This repository is the official implementation of the paper Convolutional Neural Operators for robust and accurate learning of PDEs
- The paper was presented at NeurIPS 2023
- Representative PDE Benchmarks (RPB) are available at this link
- Read our blog about CNOs at this link
- This repository also covers the CNO codes used in the paper Poseidon: Efficient Foundation Models for PDEs
- Visit this link for the time-dependent CNO and the CNO foundation model
- Visit the Poseidon GitHub page
The CNO is tested on a novel set of benchmarks, termed Representative PDE Benchmarks (RPB). The CNO either matches or outperforms the tested baselines on all benchmarks, both in in-distribution and out-of-distribution testing.
<p align="center"> <img src="/figures/table.png" width="750"/> </p> <p align="center"> <em>Relative median L¹ test errors, for both in- and out-of-distribution testing, for different benchmarks and models.</em> </p> <br />We assess the test errors of the CNO and other baselines at different testing resolutions, notably for the Navier-Stokes equations benchmarks. We observe that in this case, the CNO is the only model that demonstrates approximate error invariance with respect to the test resolution.
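The relative median L¹ test error reported in the table can be computed as follows. This is a minimal NumPy sketch of the metric; the function names and array layout (test samples along the first axis) are our own assumptions, not the repository's API:

```python
import numpy as np

def relative_l1_errors(pred, true):
    # Per-sample relative L1 error: |pred - true| summed over the grid,
    # normalized by |true| summed over the grid.
    axes = tuple(range(1, pred.ndim))
    return np.abs(pred - true).sum(axis=axes) / np.abs(true).sum(axis=axes)

def relative_median_l1_error(pred, true):
    # Median over the test set of the per-sample relative L1 errors,
    # as reported in the table above.
    return float(np.median(relative_l1_errors(pred, true)))

# Toy usage: predictions uniformly 10% above the truth give a 10% error.
true = np.ones((4, 8, 8))
pred = 1.1 * true
print(relative_median_l1_error(pred, true))  # ~0.1, up to float rounding
```

The median (rather than the mean) over the test set makes the reported number robust to a few hard outlier samples.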
<p align="center"> <img src="/figures/resolution_NS.png" width="500"/> </p> <p align="center"> <em>The CNO model has almost constant testing error across different resolutions (Navier-Stokes).</em> </p>

Code Instructions:
- The original CNO code from NeurIPS 2023 is located in the folder CNO2d_classic
    - All the instructions for this version can be found in the readme.md file in the folder
    - The code is more complex to configure than the vanilla CNO code (see below)
- Vanilla CNO2d and CNO1d versions are located in the folders CNO2d_simplified and CNO1d_simplified
    - All the instructions for these versions can be found in the readme.md files in the folders
    - The models are termed "vanilla CNO" because the interpolation filters cannot be manually designed
    - The codes do not use the CUDA kernel, which makes them simple to configure
- Codes for the Time-Dependent CNO2d are located in the folder CNO2d_temporal
    - The Time-Dependent CNO and CNO foundation model codes are used in the paper Poseidon: Efficient Foundation Models for PDEs
    - All the instructions for the Time-Dependent CNO can be found in the readme.md file in the folder
- Codes for the CNO foundation model are located in the folder CNO2d_temporal
    - All the instructions for the CNO foundation model can be found in the readme.md file in the folder
    - Codes for finetuning the CNO-FM are also located in the folder
    - The weights of the CNO foundation model (109M) can be downloaded at this link
- Codes for the other baselines are located in the folder _OtherModels
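As context for the "interpolation filters" mentioned above: the CNO applies its pointwise activations between bandlimited up- and downsampling steps, so that the high frequencies created by the nonlinearity are filtered out instead of aliasing onto low ones. A minimal 1D NumPy sketch of that idea, using Fourier-space resampling of periodic signals (our own simplification; the repository's 2D implementation and its filter design differ):

```python
import numpy as np

def upsample(u, m):
    # Bandlimited upsampling of a periodic 1D signal to m points,
    # via zero-padding in Fourier space.
    n = u.shape[-1]
    spec = np.zeros(m // 2 + 1, dtype=complex)
    low = np.fft.rfft(u)
    spec[: low.shape[-1]] = low
    return np.fft.irfft(spec, n=m) * (m / n)

def downsample(u, n):
    # Anti-aliased downsampling to n points: lowpass by truncating
    # the spectrum before sampling on the coarse grid.
    m = u.shape[-1]
    spec = np.fft.rfft(u)[: n // 2 + 1]
    return np.fft.irfft(spec, n=n) * (n / m)

def filtered_leaky_relu(u, factor=2):
    # Apply the pointwise nonlinearity on a finer grid, then filter
    # back down to the original resolution.
    n = u.shape[-1]
    fine = upsample(u, factor * n)
    act = np.where(fine > 0.0, fine, 0.2 * fine)  # LeakyReLU(0.2)
    return downsample(act, n)
```

For a bandlimited input, `downsample(upsample(u, m), n)` is (up to the Nyquist mode) the identity, so the extra resolution only affects the nonlinearity, not the signal representation itself.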
Datasets
We cover instances of the Poisson, Wave, Navier-Stokes, Allen-Cahn, Transport and Compressible Euler equations and Darcy flow. Data can be downloaded from https://zenodo.org/records/10406879 (~2.4GB).
Alternatively, run the script download_data.py, which downloads all required data into the appropriate folders (it requires 'wget' to be installed on your system):
python3 download_data.py
Poseidon: Efficient Foundation Models for PDEs
We also provide all datasets used in the paper Poseidon: Efficient Foundation Models for PDEs on the 🤗 Hub. You can download them from the respective collections:
Please also visit the Poseidon GitHub page.
Citation
If you use our models, code, or datasets, please consider citing our paper:
@misc{CNO,
title={Convolutional Neural Operators for robust and accurate learning of PDEs},
author={Bogdan Raonić and Roberto Molinaro and Tim De Ryck and Tobias Rohner and Francesca Bartolucci and Rima Alaifari and Siddhartha Mishra and Emmanuel de Bézenac},
year={2023},
eprint={2302.01178},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
