VeriGauge
A unified toolbox for running major robustness verification approaches for DNNs. [S&P 2023]
VeriGauge: Unified Toolbox for Representative Robustness Verification Approaches for Deep Neural Networks
Current Maintainer: llylly (linyi2@illinois.edu, linyil.com) @ Secure Learning Lab, UIUC
This is a unified toolbox for representative robustness verification approaches for DNNs. The leaderboard for different approaches can be found here: https://github.com/AI-secure/Provable-Training-and-Verification-Approaches-Towards-Robust-Neural-Networks.
Related paper: SoK: Certified Robustness for Deep Neural Networks.
@inproceedings{li2023sok,
title={SoK: Certified Robustness for Deep Neural Networks},
author={Linyi Li and Tao Xie and Bo Li},
booktitle={44th {IEEE} Symposium on Security and Privacy, {SP} 2023, San Francisco, CA, USA, 22-26 May 2023},
publisher={IEEE},
year={2023}
}
What is robustness verification for DNNs?
DNNs are vulnerable to adversarial examples. Given a model and an input x0, robustness verification approaches certify that there are no adversarial examples within radius r around x0. Complete verification of DNNs is NP-complete [1,2]. Therefore, current verification approaches usually rely on relaxations, which may yield a certified radius r smaller than the true robust radius.
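To make the notion concrete, here is a minimal, self-contained sketch of the simplest such relaxation, interval bound propagation, on a toy two-layer ReLU network. This is an illustration of the general idea only, not VeriGauge's implementation; all function names here are ours.

```python
import numpy as np

def interval_bounds(W, b, l, u):
    """Propagate an input box [l, u] through an affine layer y = W x + b.
    Standard interval arithmetic: positive weights take the matching bound,
    negative weights take the opposite one."""
    W_pos, W_neg = np.maximum(W, 0), np.minimum(W, 0)
    return W_pos @ l + W_neg @ u + b, W_pos @ u + W_neg @ l + b

def certify(W1, b1, W2, b2, x0, r, label):
    """Certify that a 2-layer ReLU network keeps prediction `label` for all
    inputs in the l_inf ball of radius r around x0. Sound but incomplete:
    True is a guarantee, False only means the relaxation is too loose."""
    l1, u1 = interval_bounds(W1, b1, x0 - r, x0 + r)
    l1, u1 = np.maximum(l1, 0), np.maximum(u1, 0)  # ReLU is monotone
    l2, u2 = interval_bounds(W2, b2, l1, u1)
    # Robust iff the true logit's lower bound beats every other logit's upper bound.
    return all(l2[label] > u2[j] for j in range(len(l2)) if j != label)
```

Because the box is relaxed layer by layer, the largest r for which `certify` returns True is generally smaller than the model's true robust radius, which is exactly the gap the approaches below trade off against cost.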
What neural networks are supported?
Currently, existing approaches mainly target image classification on MNIST, CIFAR-10, and ImageNet, and our toolbox supports all three, though networks for ImageNet are usually too large for standard non-probabilistic verification approaches. While a significant number of verification approaches support skip connections, max-pooling layers, etc., typical verification approaches mainly work on feed-forward ReLU networks containing only fully-connected and convolutional layers.
Main Features:
- A unified lightweight platform for running about 20 verification approaches in a simple PyTorch-based interface.
- Easily extensible to your own customized neural networks.
- Easily extensible to your own verification approaches.
- High efficiency thanks to the lightweight structure.
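The extensibility in the second and third bullets follows an adaptor pattern: each verification approach is wrapped behind a common interface. The sketch below is hypothetical — the class names, `verify` signature, and model representation are ours, not VeriGauge's actual API; see adaptor/basic_adaptor.py for the real base class.

```python
from abc import ABC, abstractmethod
import numpy as np

class VerifierAdaptor(ABC):
    """Hypothetical sketch of a common verification interface
    (illustrative only; not VeriGauge's real base class)."""
    def __init__(self, model):
        self.model = model

    @abstractmethod
    def verify(self, x0, label, radius):
        """Return True iff the model provably keeps `label` on the
        l_inf ball of `radius` around `x0`."""

class IntervalAdaptor(VerifierAdaptor):
    """Toy adaptor: interval bounds for a single affine layer,
    with the model given as a (weights, bias) pair."""
    def verify(self, x0, label, radius):
        W, b = self.model
        W_pos, W_neg = np.maximum(W, 0), np.minimum(W, 0)
        lo = W_pos @ (x0 - radius) + W_neg @ (x0 + radius) + b
        hi = W_pos @ (x0 + radius) + W_neg @ (x0 - radius) + b
        return all(lo[label] > hi[j] for j in range(len(lo)) if j != label)
```

Under such a pattern, plugging in a new approach means subclassing the base adaptor and implementing one method, which is why all the approaches in the table below can be benchmarked through the same loop.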
[1] Reluplex: An Efficient SMT Solver for Verifying Deep Neural Networks. https://arxiv.org/abs/1702.01135.
[2] Towards Fast Computation of Certified Robustness for ReLU Networks. http://proceedings.mlr.press/v80/weng18a.html.
Supported Approach List
| Approach Type | ClassName | Description & Path | Comments |
| ------------------ | -------------------- | ------------------------------------------------------------ | ---------------------------------------------------------- |
| Normal Inference | CleanAdaptor | The normal inference of the model. Implemented in adaptor.basic_adaptor. | Not a verification approach. |
| Empirical Attack | PGDAdaptor | Based on the Python toolbox cleverhans. Implemented in adaptor.basic_adaptor. | Not a verification approach; only provides an upper bound of r |
| Empirical Attack | CWAdaptor | Based on the Python toolbox cleverhans. Implemented in adaptor.basic_adaptor. | Not a verification approach; only provides an upper bound of r |
| MILP | MILPAdaptor | Reimplementation of Tjeng et al's MILP-based verification based on Python's Gurobi API. Adaptor in adaptor.basic_adaptor. Core in basic.fastmilp.MILPVerifier. | Complete Verification |
| SDP | PercySDPAdaptor | Reimplementation of Raghunathan et al's SDP-based verification based on cvxpy. Adaptor in adaptor.basic_adaptor. Core in basic.percysdp. | |
| SDP | FazlybSDPAdaptor | Reimplementation of Fazlyb et al's SDP-based verification based on cvxpy. Adaptor in adaptor.basic_adaptor. Core in basic.components.BaselinePointVerifierExt. | |
| Linear-Based | FastLinIBPAdaptor | Reimplementation of the combination of IBP and FastLin bounds (per-layer $l$ and $u$ are taken as the tighter of the two, i.e., the elementwise maximum of the lower bounds and minimum of the upper bounds). Adaptor in adaptor.basic_adaptor. Core in basic.intervalbound. | |
| Linear-Based | FastLinAdaptor | Weng et al's linear bound propagation based verification approach. Adaptor in adaptor.recurjac_adaptor. Core in recurjac/. | |
| Linear-Based | FastLinSparseAdaptor | Weng et al's linear bound propagation based verification approach. This implementation is accelerated by sparse matrix multiplication. Adaptor in adaptor.cnncert_adaptor. Core in cnn_cert/. | |
| Linear-Based | CNNCertAdaptor | Boopathy et al's linear bound propagation based verification approach. This approach extends FastLin and CROWN to general CNN/Residual/Sigmoid neural networks with high efficiency. Adaptor in adaptor.cnncert_adaptor. Core in cnn_cert/. | |
| Linear-Based | LPAllAdaptor | LP-full verification approach, which computes $l$ and $u$ layerwise by linear programming. It is mentioned by Boopathy et al and Weng et al, and analyzed by Salman et al. Adaptor in adaptor.cnncert_adaptor. Core in cnn_cert/ (we use Boopathy et al's implementation). | |
| Linear-Based | ZicoDualAdaptor | Wong et al's linear dual-based verification approach. Adaptor in adaptor.lpdual_adaptor. Core in convex_adversarial/. | |
| Linear-Based | FullCrownAdaptor | Zhang et al's linear bound propagation based verification approach. Adaptor in adaptor.crown_adaptor. Core in crown_ibp/. | |
| Linear-Based | CrownIBPAdaptor | Zhang et al's linear + interval bound propagation based verification approach. Adaptor in adaptor.crown_adaptor. Core in crown_ibp/. | |
| Linear-Based | IBPAdaptor | Gowal et al's interval propagation based verification approach. Adaptor in adaptor.crown_adaptor. Core in crown_ibp/. Our own re-implementation is available at adaptor.basic_adaptor.IBPAdaptor, with performance similar to Zhang et al's and Gowal et al's implementations. For simplicity, Zhang et al's implementation is used by default. | |
| Lipschitz | FastLipAdaptor | Weng et al's Lipschitz based verification approach. Adaptor in adaptor.recurjac_adaptor. Core in recurjac/. | |
| Lipschitz | RecurJacAdaptor | Zhang et al's Lipschitz based verification approach. Adaptor in adaptor.recurjac_adaptor. Core in recurjac/. | |
| Lipschitz | SpectralAdaptor | Szegedy et al's Lipschitz (spectral bound) based verification approach. Adaptor in adaptor.recurjac_adaptor. Core in recurjac/ (we leverage Zhang et al's implementation). | |
| Branch and Bound | AI2Adaptor | Gehr et al's branch-and-bound based complete verification approach (concretely, using the domain of sets of polyhedra). Adaptor in adaptor.eran_adaptor. Core in eran/. | Complete Verification |
| Linear-Based | DeepPolyAdaptor | Singh et al's linear relaxation-based verification approach. Adaptor in adaptor.eran_adaptor. Core in eran/. | |
| Hybrid | RefineZonoAdaptor | Singh et al's linear relaxation + MILP + IBP hybrid verification approach. Adaptor in adaptor.eran_adaptor. Core in eran/. | |
| Linear-Based | KReluAdaptor | Singh et al's linear relaxation based verification approach, refining $l$ and $u$ via the joint relaxation of multiple neurons. Adaptor in adaptor.eran_adaptor. Core in eran/. | |
Prerequisites
- Find a server with GPU support and a Linux / MacOS system. (The toolbox has been tested on Linux and MacOS. It should be possible to run it on Windows, but we cannot guarantee it.)
- Prepare the necessary datasets:
  - If you only want to benchmark on MNIST and CIFAR-10, you don't need to do anything, since PyTorch will download them automatically later.
  - If you want to benchmark on ImageNet, in datasets.py, p
