# NanoDet-Plus

Super fast and high-accuracy lightweight anchor-free object detection model. Real-time on mobile devices.

- ⚡ Super lightweight: model file is only 980KB (INT8) or 1.8MB (FP16).
- ⚡ Super fast: 97 FPS (10.23 ms) on mobile ARM CPU.
- 👍 High accuracy: up to 34.3 mAP<sup>val</sup>@0.5:0.95 while still real-time on CPU.
- 🤗 Training friendly: much lower GPU memory cost than other models; batch size 80 fits on a GTX 1060 6G.
- 😎 Easy to deploy: supports various backends including ncnn, MNN, and OpenVINO; an Android demo based on the ncnn inference framework is also provided.
## Introduction

NanoDet is an FCOS-style one-stage anchor-free object detection model that uses Generalized Focal Loss as its classification and regression loss.

In NanoDet-Plus, we propose a novel label assignment strategy with a simple Assign Guidance Module (AGM) and a Dynamic Soft Label Assigner (DSLA) to solve the optimal label assignment problem in lightweight model training. We also introduce a light feature pyramid called Ghost-PAN to enhance multi-layer feature fusion. These improvements boost the previous NanoDet's detection accuracy by 7 mAP on the COCO dataset.
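The quality focal loss in Generalized Focal Loss replaces the hard one-hot classification target with a soft quality target (typically the IoU with the matched ground truth). A minimal scalar sketch of the loss form from the GFL paper, for illustration only (NanoDet's actual implementation is batched and tensorized; `beta = 2.0` follows the paper's default):

```python
import math

def quality_focal_loss(pred: float, quality: float, beta: float = 2.0) -> float:
    """Binary cross-entropy against a soft quality target q in [0, 1],
    down-weighted by |q - p|**beta so well-fit examples contribute little."""
    eps = 1e-12
    bce = -(quality * math.log(pred + eps)
            + (1.0 - quality) * math.log(1.0 - pred + eps))
    return abs(quality - pred) ** beta * bce
```

The modulating factor vanishes when the prediction matches the quality target, so calibrated predictions are not penalized, while large gaps are punished sharply.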
QQ chat group: 908606542 (join answer: 炼丹)
## Benchmarks

Model | Resolution | mAP<sup>val</sup><br>0.5:0.95 | CPU Latency<br>(i7-8700) | ARM Latency<br>(4xA76) | FLOPS | Params | Model Size
:------------------:|:-------:|:----:|:-------:|:-------:|:-----:|:-----:|:---------------------------
NanoDet-m | 320x320 | 20.6 | 4.98ms | 10.23ms | 0.72G | 0.95M | 1.8MB (FP16) / 980KB (INT8)
NanoDet-Plus-m | 320x320 | 27.0 | 5.25ms | 11.97ms | 0.9G | 1.17M | 2.3MB (FP16) / 1.2MB (INT8)
NanoDet-Plus-m | 416x416 | 30.4 | 8.32ms | 19.77ms | 1.52G | 1.17M | 2.3MB (FP16) / 1.2MB (INT8)
NanoDet-Plus-m-1.5x | 320x320 | 29.9 | 7.21ms | 15.90ms | 1.75G | 2.44M | 4.7MB (FP16) / 2.3MB (INT8)
NanoDet-Plus-m-1.5x | 416x416 | 34.1 | 11.50ms | 25.49ms | 2.97G | 2.44M | 4.7MB (FP16) / 2.3MB (INT8)
YOLOv3-Tiny | 416x416 | 16.6 | - | 37.6ms | 5.62G | 8.86M | 33.7MB
YOLOv4-Tiny | 416x416 | 21.7 | - | 32.81ms | 6.96G | 6.06M | 23.0MB
YOLOX-Nano | 416x416 | 25.8 | - | 23.08ms | 1.08G | 0.91M | 1.8MB (FP16)
YOLOv5-n | 640x640 | 28.4 | - | 44.39ms | 4.5G | 1.9M | 3.8MB (FP16)
FBNetV5 | 320x640 | 30.4 | - | - | 1.8G | - | -
MobileDet | 320x320 | 25.6 | - | - | 0.9G | - | -
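The headline "97 FPS" figure is simply the reciprocal of NanoDet-m's 10.23 ms ARM latency; a quick sanity check:

```python
def fps_from_latency(latency_ms: float) -> float:
    """Convert a per-frame latency in milliseconds to frames per second."""
    return 1000.0 / latency_ms

print(int(fps_from_latency(10.23)))  # NanoDet-m on 4xA76 -> 97
```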
Download pre-trained models and find more models in the Model Zoo or in the Release Files.
<details>
<summary>Notes (click to expand)</summary>

- ARM performance is measured on a Kirin 980 (4xA76 + 4xA55) ARM CPU based on ncnn. You can test latency on your phone with ncnn_android_benchmark.
- Intel CPU performance is measured on an Intel Core i7-8700 based on OpenVINO.
- NanoDet mAP(0.5:0.95) is validated on the COCO val2017 dataset with no test-time augmentation.
- YOLOv3 & YOLOv4 mAP values are taken from Scaled-YOLOv4: Scaling Cross Stage Partial Network.

</details>
## NEWS!!!

- [2023.01.20] Upgrade to pytorch-lightning-1.9. The minimum PyTorch version is upgraded to 1.10. Support FP16 training (thanks @crisp-snakey). Support ignore label (thanks @zero0kiriyu).
- [2022.08.26] Upgrade to pytorch-lightning-1.7. The minimum PyTorch version is upgraded to 1.9. To use a previous version of PyTorch, please install NanoDet <= v1.0.0-alpha-1.
- [2021.12.25] NanoDet-Plus release! Adds AGM (Assign Guidance Module) & DSLA (Dynamic Soft Label Assigner) to improve mAP by 7 with only a small extra cost.

Find more update notes in Update notes.
## Demo

### Android demo

The Android demo project is in the demo_android_ncnn folder. Please refer to the Android demo guide.

Here is a better implementation 👉 ncnn-android-nanodet
### NCNN C++ demo

The C++ demo based on ncnn is in the demo_ncnn folder. Please refer to the Cpp demo guide.
### MNN demo

Inference using Alibaba's MNN framework is in the demo_mnn folder. Please refer to the MNN demo guide.
### OpenVINO demo

Inference using OpenVINO is in the demo_openvino folder. Please refer to the OpenVINO demo guide.
### Web browser demo

https://nihui.github.io/ncnn-webassembly-nanodet/
### PyTorch demo

First, install the requirements and set up NanoDet following the installation guide. Then download the COCO pretrained weight from here.

The pre-trained weight was trained with the config config/nanodet-plus-m_416.yml.
- Inference images

  ```bash
  python demo/demo.py image --config CONFIG_PATH --model MODEL_PATH --path IMAGE_PATH
  ```

- Inference video

  ```bash
  python demo/demo.py video --config CONFIG_PATH --model MODEL_PATH --path VIDEO_PATH
  ```

- Inference webcam

  ```bash
  python demo/demo.py webcam --config CONFIG_PATH --model MODEL_PATH --camid YOUR_CAMERA_ID
  ```
In addition, we provide a notebook here to demonstrate how to make it work with PyTorch.
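All of the demos above share the same post-processing: score thresholding followed by non-maximum suppression. A minimal pure-Python NMS sketch for orientation (NanoDet's actual code uses batched torchvision ops; the `[x1, y1, x2, y2]` box format and threshold here are illustrative assumptions):

```python
def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-12)

def nms(boxes, scores, iou_thr=0.5):
    """Greedy NMS: keep the highest-scoring box, drop overlapping rivals."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[k]) <= iou_thr for k in keep):
            keep.append(i)
    return keep
```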
## Install

### Requirements

- Linux or macOS
- CUDA >= 10.2
- Python >= 3.7
- PyTorch >= 1.10.0, < 2.0.0
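The PyTorch bound above (at least 1.10.0 but below 2.0.0) can be checked programmatically. A small helper, hypothetical and not part of NanoDet, that parses `torch.__version__`-style strings:

```python
def torch_version_ok(version: str) -> bool:
    """True if version falls in the supported range [1.10, 2.0)."""
    major, minor = (int(p) for p in version.split(".")[:2])
    return (1, 10) <= (major, minor) < (2, 0)
```

Local build suffixes such as `+cu117` are ignored because only the first two dotted components are parsed.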
### Step

- Create a conda virtual environment and then activate it.

  ```bash
  conda create -n nanodet python=3.8 -y
  conda activate nanodet
  ```

- Install PyTorch.

  ```bash
  conda install pytorch torchvision cudatoolkit=11.1 -c pytorch -c conda-forge
  ```

- Clone this repository.

  ```bash
  git clone https://github.com/RangiLyu/nanodet.git
  cd nanodet
  ```

- Install requirements.

  ```bash
  pip install -r requirements.txt
  ```

- Set up NanoDet.

  ```bash
  python setup.py develop
  ```
## Model Zoo

NanoDet supports a variety of backbones. Go to the config folder to see the sample training config files.

Model | Backbone | Resolution | COCO mAP | FLOPS | Params | Pre-train weight
:---------------------------:|:-----------------:|:-------:|:----:|:-----:|:-----:|:-------------------:
NanoDet-m | ShuffleNetV2 1.0x | 320x320 | 20.6 | 0.72G | 0.95M | Download
NanoDet-Plus-m-320 (NEW) | ShuffleNetV2 1.0x | 320x320 | 27.0 | 0.9G | 1.17M | Weight / Checkpoint
NanoDet-Plus-m-416 (NEW) | ShuffleNetV2 1.0x | 416x416 | 30.4 | 1.52G | 1.17M | Weight / Checkpoint
NanoDet-Plus-m-1.5x-320 (NEW) | ShuffleNetV2 1.5x | 320x320 | 29.9 | 1.75G | 2.44M | Weight / Checkpoint
NanoDet-Plus-m-1.5x-416 (NEW) | ShuffleNetV2 1.5x | 416x416 | 34.1 | 2.97G | 2.44M | Weight / Checkpoint
Notice: The difference between Weight and Checkpoint is that the weight file only provides the parameters needed at inference time, while the checkpoint also contains training-time state.
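Since NanoDet trains with pytorch-lightning, a checkpoint bundles optimizer state and other training bookkeeping alongside the weights, which is why it is larger than the inference-only weight file. A sketch of the relationship; the key names mirror a typical Lightning checkpoint layout and are an assumption, not NanoDet's exact on-disk format:

```python
def strip_to_weights(checkpoint: dict) -> dict:
    """Keep only the entries needed at inference time, dropping training state."""
    training_only = {"optimizer_states", "lr_schedulers", "epoch", "global_step"}
    return {k: v for k, v in checkpoint.items() if k not in training_only}

# Toy checkpoint illustrating the split between model and training state.
ckpt = {
    "state_dict": {"head.weight": [0.1]},  # model parameters (kept)
    "optimizer_states": [{}],              # training-only (dropped)
    "lr_schedulers": [{}],                 # training-only (dropped)
    "epoch": 299,
    "global_step": 100000,
}
weight = strip_to_weights(ckpt)
```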
## Legacy Model Zoo

Model | Backbone | Resolution | COCO mAP | FLOPS | Params | Pre-train weight
:-------------:|:-----------------:|:-------:|:----:|:-----:|:-----:|:----------------:
NanoDet-m-416 | Shuffle
