CorrMatch (CVPR 2024)
This repository contains the official implementation of the following paper:
CorrMatch: Label Propagation via Correlation Matching for Semi-Supervised Semantic Segmentation<br>
Boyuan Sun, Yuqi Yang, Le Zhang, Ming-Ming Cheng, Qibin Hou
🔥 The Jittor implementation of CorrMatch is available at Jittor Version!
🔥 Our paper has been accepted by the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2024!
Overview
CorrMatch mines more high-quality regions from unlabeled images, leveraging unlabeled data more effectively for consistency regularization.

Previous approaches mostly employ complicated training strategies to leverage unlabeled data but overlook the role of correlation maps in modeling the relationships between pairs of locations. Thus, we introduce two label propagation strategies (Pixel Propagation and Region Propagation) with the help of correlation maps.
For technical details, please refer to our full paper on arXiv.
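In a simplified form, the pixel propagation idea can be sketched with NumPy: a correlation map built from pairwise feature similarities is used to spread each pixel's class distribution to similar pixels. The function name, temperature value, and flattened-pixel layout below are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def propagate_labels(features, probs, temperature=0.1):
    """Propagate per-pixel class probabilities through a correlation map.

    features: (N, C) L2-normalized per-pixel features (N flattened pixels)
    probs:    (N, K) per-pixel class probabilities over K classes
    Returns the propagated (N, K) probabilities.
    """
    # Correlation map: pairwise cosine similarity between all pixel pairs.
    corr = features @ features.T                    # (N, N)
    # Row-wise softmax turns similarities into propagation weights.
    w = np.exp(corr / temperature)
    w /= w.sum(axis=1, keepdims=True)
    # Each pixel's label distribution becomes a similarity-weighted
    # average of all pixels' label distributions.
    return w @ probs
```

Because the propagation weights in each row sum to one, the output rows remain valid probability distributions.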
Getting Started
Installation
git clone git@github.com:BBBBchan/CorrMatch.git
cd CorrMatch
conda create -n corrmatch python=3.9
conda activate corrmatch
conda install pytorch==1.13.1 torchvision==0.14.1 torchaudio==0.13.1 pytorch-cuda=11.7 -c pytorch -c nvidia
pip install opencv-python tqdm einops pyyaml
Pretrained Backbone:
mkdir pretrained
Please put the pretrained model under the pretrained directory.
Dataset:
- Pascal VOC 2012: JPEGImages | SegmentationClass
- Cityscapes: leftImg8bit | gtFine
Please modify the dataset path in the configuration files. The ground-truth mask IDs have already been pre-processed, so you can use them directly.
Your dataset path may look like:
├── [Your Pascal Path]
│   ├── JPEGImages
│   └── SegmentationClass
├── [Your Cityscapes Path]
│   ├── leftImg8bit
│   └── gtFine
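The dataset path edit might look like the fragment below. The file name and the `data_root` key are assumptions for illustration; check the configuration files shipped in the repository for the actual key names:

```yaml
# Hypothetical excerpt of a dataset configuration file (e.g. for Pascal VOC).
# `data_root` is an assumed key name; use whatever key the repo's configs define.
dataset: pascal
data_root: /path/to/VOC2012   # should contain JPEGImages/ and SegmentationClass/
```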
Usage
Training CorrMatch
sh tools/train.sh <num_gpu> <port>
To run on different labeled data partitions or different datasets, please modify:
config, labeled_id_path, unlabeled_id_path, and save_path in train.sh.
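The edit might look like the following sketch. The variable names come from train.sh as described above, but the values are placeholders, not paths shipped with the repository:

```shell
# Hypothetical excerpt of tools/train.sh — adjust values for your setup.
config=configs/pascal.yaml                    # dataset / model configuration
labeled_id_path=splits/pascal/92/labeled.txt    # ids of labeled images
unlabeled_id_path=splits/pascal/92/unlabeled.txt
save_path=exp/pascal/92                       # where checkpoints and logs go
```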
Evaluation
sh tools/val.sh <num_gpu> <port>
To evaluate your checkpoint, please modify checkpoint_path in val.sh.
Results
Pascal VOC 2012
Labeled images are sampled from the original high-quality training set. Results are obtained with DeepLabv3+ based on ResNet-101 at a training size of 321 (513).
| Method | 1/16 (92) | 1/8 (183) | 1/4 (366) | 1/2 (732) | Full (1464) |
|:----------------:|:---------:|:---------:|:---------:|:---------:|:-----------:|
| SupOnly | 45.1 | 55.3 | 64.8 | 69.7 | 73.5 |
| ST++ | 65.2 | 71.0 | 74.6 | 77.3 | 79.1 |
| PS-MT | 65.8 | 69.6 | 76.6 | 78.4 | 80.0 |
| UniMatch | 75.2 | 77.2 | 78.8 | 79.9 | 81.2 |
| CorrMatch (Ours) | 76.4 | 78.5 | 79.4 | 80.6 | 81.8 |
Cityscapes
Results are obtained by DeepLabv3+ based on ResNet-101.
| Method | 1/16 (186) | 1/8 (372) | 1/4 (744) | 1/2 (1488) |
|:----------------:|:----------:|:---------:|:---------:|:----------:|
| SupOnly | 65.7 | 72.5 | 74.4 | 77.8 |
| UniMatch | 76.6 | 77.9 | 79.2 | 79.5 |
| CorrMatch (Ours) | 77.3 | 78.5 | 79.4 | 80.4 |
Citation
If you find our repo useful for your research, please consider citing our paper:
@inproceedings{sun2023corrmatch,
  title={CorrMatch: Label Propagation via Correlation Matching for Semi-Supervised Semantic Segmentation},
  author={Sun, Boyuan and Yang, Yuqi and Zhang, Le and Cheng, Ming-Ming and Hou, Qibin},
  booktitle={IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2024}
}
License
This code is licensed under the Creative Commons Attribution-NonCommercial 4.0 International license for non-commercial use only. Any commercial use of this code requires formal permission prior to use.
Contact
For technical questions, please contact sbysbysby123[AT]gmail.com.
For commercial licensing, please contact cmm[AT]nankai.edu.cn or andrewhoux@gmail.com.
Acknowledgement
We thank UniMatch, CPS, CutMix-Seg, DeepLabv3Plus, U<sup>2</sup>PL, and other excellent works (see this project) for their amazing contributions!
