DyRep: Bootstrapping Training with Dynamic Re-parameterization

Official implementation for paper "DyRep: Bootstrapping Training with Dynamic Re-parameterization", CVPR 2022.

By Tao Huang, Shan You, Bohan Zhang, Yuxuan Du, Fei Wang, Chen Qian, Chang Xu.

:fire: Training code is available in the image_classification_sota repository.

<p align='center'> <img src='./assests/DyRep_framework.png' alt='DyRep Framework' width='1000px'> </p>

Updates

March 11, 2022

The code is available at image_classification_sota.

Getting started

Clone training code

```shell
git clone https://github.com/hunto/DyRep.git --recurse-submodules
cd DyRep/image_classification_sota
```

Then prepare your environment and datasets following the README.md in image_classification_sota.

Implementation of DyRep

The core logic of DyRep is implemented in lib/models/utils/dyrep.py.
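DyRep's re-parameterization builds on the standard conv–BN folding identity: BatchNorm's per-channel affine transform can be absorbed into the preceding convolution's weight and bias, which is what lets training-time branches collapse into a single conv for inference. A minimal NumPy sketch of that folding (the function name and the 1x1-conv / single-pixel setup are illustrative, not the repo's actual API):

```python
import numpy as np

def fuse_conv_bn(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fold BatchNorm statistics into the preceding conv.

    w: (out_c, in_c) weight of a 1x1 conv (spatial dims dropped for brevity)
    b: (out_c,) conv bias; gamma/beta/mean/var: BN parameters and statistics.
    """
    scale = gamma / np.sqrt(var + eps)          # per-output-channel scale
    return w * scale[:, None], beta + (b - mean) * scale

rng = np.random.default_rng(0)
out_c, in_c = 4, 3
w, b = rng.standard_normal((out_c, in_c)), rng.standard_normal(out_c)
gamma, beta = rng.standard_normal(out_c), rng.standard_normal(out_c)
mean, var = rng.standard_normal(out_c), rng.random(out_c) + 0.1

x = rng.standard_normal(in_c)                   # one spatial position
# Training-time path: conv, then BatchNorm.
y_bn = (w @ x + b - mean) / np.sqrt(var + 1e-5) * gamma + beta
# Deploy-time path: a single fused conv.
wf, bf = fuse_conv_bn(w, b, gamma, beta, mean, var)
assert np.allclose(y_bn, wf @ x + bf)           # outputs match exactly
```

The same per-channel algebra applies to k×k kernels; only the broadcasting over the spatial dimensions changes.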

Reproducing our results

CIFAR

|Dataset|Model|Config|Paper|This repo|Log|
|:--:|:--:|:--:|:--:|:--:|:--:|
|CIFAR-10|VGG-16|config|95.22%|95.37%|log|
|CIFAR-100|VGG-16|config|74.37%|74.60%|log|

- CIFAR-10

  ```shell
  sh tools/dist_train.sh 1 configs/strategies/DyRep/cifar.yaml nas_model --model-config configs/models/VGG/vgg16_cifar10.yaml --dyrep --experiment dyrep_cifar10_vgg16
  ```

- CIFAR-100

  ```shell
  sh tools/dist_train.sh 1 configs/strategies/DyRep/cifar.yaml nas_model --model-config configs/models/VGG/vgg16_cifar100.yaml --dyrep --dataset cifar100 --experiment dyrep_cifar100_vgg16
  ```

ImageNet

|Dataset|Model|Config|Paper|This repo|Log|
|:--:|:--:|:--:|:--:|:--:|:--:|
|ImageNet|ResNet-18|config|71.58%|71.66%|log|
|ImageNet|ResNet-50|config|77.08%|77.22%|log|

- ResNets

  ```shell
  sh tools/dist_train.sh 8 configs/strategies/DyRep/resnet.yaml resnet18 --dyrep --experiment dyrep_imagenet_res18
  ```

- MobileNetV1

  ```shell
  sh tools/dist_train.sh 8 configs/strategies/DyRep/mbv1.yaml mobilenet_v1 --dyrep --experiment dyrep_imagenet_mbv1
  ```

- RepVGG

  - DyRep-A2

    ```shell
    sh tools/dist_train.sh 8 configs/strategies/DyRep/repvgg_baseline.yaml timm_repvgg_a2 --dyrep --dyrep_recal_bn_every_epoch --experiment dyrep_imagenet_repvgg_a2
    ```

  - DyRep-B2g4 and DyRep-B3

    ```shell
    sh tools/dist_train.sh 8 configs/strategies/DyRep/repvgg_strong.yaml timm_repvgg_b2g4 --dyrep --dyrep_recal_bn_every_epoch --experiment dyrep_imagenet_repvgg_b2g4
    ```

Deploying Trained DyRep Models as Inference Models

```shell
sh tools/dist_run.sh tools/convert.py ${GPUS} ${CONFIG} ${MODEL} --resume ${CHECKPOINT}
```

For example, to deploy the trained ResNet-50 model using its best checkpoint, run:

```shell
sh tools/dist_run.sh tools/convert.py 8 configs/strategies/DyRep/resnet.yaml resnet50 --dyrep --resume experiments/dyrep_imagenet_res50/best.pth.tar
```

The script evaluates the model both before and after deployment to verify that accuracy does not drop.

The final weights of the inference model will be saved in experiments/dyrep_imagenet_res50/convert/model.ckpt.
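The before/after check works because convolution is linear: parallel branches can be summed into a single kernel, so the deployed model should produce numerically identical outputs to the training-time model. A tiny NumPy illustration of that equivalence (shapes and names are hypothetical, not taken from the repo):

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.standard_normal(16)           # a flattened input patch
w_main = rng.standard_normal((8, 16)) # main branch, e.g. a 3x3 conv as a matrix
w_aux = rng.standard_normal((8, 16))  # auxiliary branch grown during training

# Training-time model: run both branches and add the results.
y_train = w_main @ x + w_aux @ x

# Deployed model: merge the branches into one kernel and run once.
y_deploy = (w_main + w_aux) @ x

assert np.allclose(y_train, y_deploy)  # same output, fewer operations
```

In practice the branches have different kernel sizes and BN layers, so the merge first folds BN and zero-pads smaller kernels to a common size before summing, but the underlying identity is the same.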

License

This project is released under the Apache 2.0 license.

Citation

```
@InProceedings{Huang_2022_CVPR,
    author    = {Huang, Tao and You, Shan and Zhang, Bohan and Du, Yuxuan and Wang, Fei and Qian, Chen and Xu, Chang},
    title     = {DyRep: Bootstrapping Training With Dynamic Re-Parameterization},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {588-597}
}
```
