# Dassl.pytorch

A PyTorch toolbox for domain generalization, domain adaptation and semi-supervised learning.
## Introduction
Dassl is a PyTorch toolbox initially developed for our project Domain Adaptive Ensemble Learning (DAEL) to support research in domain adaptation and generalization, since DAEL studies how to unify these two problems in a single learning framework. Given that domain adaptation is closely related to semi-supervised learning---both study how to exploit unlabeled data---we also incorporate components that support research on the latter.
Why the name "Dassl"? Dassl combines the initials of domain adaptation (DA) and semi-supervised learning (SSL), which sounds natural and informative.
Dassl has a modular design and unified interfaces, allowing fast prototyping and experimentation with new DA/DG/SSL methods. With Dassl, a new method can be implemented with only a few lines of code. Don't believe it? Take a look at the engine folder, which contains the implementations of many existing methods (then you will come back and star this repo). :-)
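The "few lines of code" claim rests on a registry pattern: each trainer class registers itself under a name, and the config selects one by that name. Below is a minimal, torch-free sketch of that idea; all names here (`Registry`, `SourceOnly`, `forward_backward`) are illustrative and simplified, not Dassl's actual API.

```python
class Registry:
    """Maps a string name to a trainer class (illustrative sketch)."""

    def __init__(self):
        self._classes = {}

    def register(self):
        # used as a class decorator: @TRAINER_REGISTRY.register()
        def decorator(cls):
            self._classes[cls.__name__] = cls
            return cls
        return decorator

    def build(self, name, *args, **kwargs):
        # instantiate the class registered under `name`
        return self._classes[name](*args, **kwargs)


TRAINER_REGISTRY = Registry()


@TRAINER_REGISTRY.register()
class SourceOnly:
    """A 'new method' only needs to define how one batch is processed;
    the shared training loop, logging, etc. live in a base class."""

    def forward_backward(self, batch):
        return {"loss": 0.0}


trainer = TRAINER_REGISTRY.build("SourceOnly")
```

Because the lookup key is the class name, the config only has to carry a string, and adding a method never touches the framework's core loop.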
Basically, Dassl is perfect for doing research in the following areas:
- Domain adaptation
- Domain generalization
- Semi-supervised learning
BUT, thanks to the neat design, Dassl can also be used as a codebase to develop any deep learning projects, like this. :-)
A drawback of Dassl is that it doesn't (yet? hmm) support distributed multi-GPU training (Dassl uses DataParallel to wrap a model, which is less efficient than DistributedDataParallel).
We don't provide detailed documentation for Dassl, unlike another project of ours. This is because Dassl is developed for research purposes, and as researchers we think it's important to be able to read source code, so we highly encourage you to do so---definitely not because we are lazy. :-)
## What's new
- [Oct 2022] New paper "On-Device Domain Generalization" is out! Code, models and datasets: https://github.com/KaiyangZhou/on-device-dg.
- [Jun 2022] `v0.6.0`: Make `cfg.TRAINER.METHOD_NAME` consistent with the method class name.
- [Jun 2022] A new domain adaptation method, CDAC (CVPR'21), is added by Shreejal Trivedi. See here for more details.
- [Jun 2022] Added three datasets from the WILDS benchmark: iWildCam, FMoW and Camelyon17. See here for more details.
- [May 2022] A new domain generalization method DDG developed by Zhishu Sun and to appear at IJCAI'22 is added to this repo. See here for more details.
- [Mar 2022] A new domain generalization method EFDM developed by Yabin Zhang (PolyU) and to appear at CVPR'22 is added to this repo. See here for more details.
- [Feb 2022] In case you don't know, a class in the painting domain of DomainNet (the official splits) only has test images (no training images), which could affect performance. See section 4.a in our paper for more details.
- [Oct 2021] `v0.5.0`: Important changes made to `transforms.py`. 1) `center_crop` becomes a default transform in testing (applied after resizing the smaller edge to a certain size to keep the image aspect ratio). 2) For training, `Resize(cfg.INPUT.SIZE)` is deactivated when `random_crop` or `random_resized_crop` is used. These changes won't make any difference to the training transforms used in existing config files, nor to the testing transforms unless the raw images are not square (the only difference is that now the image aspect ratio is respected).
- [Oct 2021] `v0.4.3`: Copy the attributes in `self.dm` (the data manager) to `SimpleTrainer` and make `self.dm` optional, which means that from now on you can build data loaders from any source you like rather than being forced to use `DataManager`.
- [Sep 2021] `v0.4.2`: An important update is to set `drop_last=is_train and len(data_source)>=batch_size` when constructing a data loader, to avoid zero-length batches.
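The `drop_last` rule in `v0.4.2` exists because dropping the last incomplete batch from a dataset smaller than the batch size would leave zero batches. A plain-Python sketch of the batch-count arithmetic (illustrative only; `num_batches` is a hypothetical helper, not Dassl's code):

```python
import math


def num_batches(data_source_len, batch_size, is_train):
    # mirrors the rule: only drop the last incomplete batch during
    # training, and only when at least one full batch exists
    drop_last = is_train and data_source_len >= batch_size
    if drop_last:
        return data_source_len // batch_size
    return math.ceil(data_source_len / batch_size)


# e.g. a training set of 10 samples with batch size 32: without the
# len(data_source) >= batch_size guard, drop_last=True would yield
# zero batches and the training loop would never run
```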
## Overview
Dassl has implemented the following methods:
- Single-source domain adaptation
  - Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation (CVPR'21) [`dassl/engine/da/cdac.py`]
  - Semi-supervised Domain Adaptation via Minimax Entropy (ICCV'19) [`dassl/engine/da/mme.py`]
  - Maximum Classifier Discrepancy for Unsupervised Domain Adaptation (CVPR'18) [`dassl/engine/da/mcd.py`]
  - Self-ensembling for visual domain adaptation (ICLR'18) [`dassl/engine/da/self_ensembling.py`]
  - Revisiting Batch Normalization For Practical Domain Adaptation (ICLR-W'17) [`dassl/engine/da/adabn.py`]
  - Adversarial Discriminative Domain Adaptation (CVPR'17) [`dassl/engine/da/adda.py`]
  - Domain-Adversarial Training of Neural Networks (JMLR'16) [`dassl/engine/da/dann.py`]
- Multi-source domain adaptation
- Domain generalization
  - Dynamic Domain Generalization (IJCAI'22) [`dassl/modeling/backbone/resnet_dynamic.py`] [`dassl/engine/dg/domain_mix.py`]
  - Exact Feature Distribution Matching for Arbitrary Style Transfer and Domain Generalization (CVPR'22) [`dassl/modeling/ops/efdmix.py`]
  - Domain Generalization with MixStyle (ICLR'21) [`dassl/modeling/ops/mixstyle.py`]
  - Deep Domain-Adversarial Image Generation for Domain Generalisation (AAAI'20) [`dassl/engine/dg/ddaig.py`]
  - Generalizing Across Domains via Cross-Gradient Training (ICLR'18) [`dassl/engine/dg/crossgrad.py`]
- Semi-supervised learning
  - FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence (NeurIPS'20) [`dassl/engine/ssl/fixmatch.py`]
  - MixMatch: A Holistic Approach to Semi-Supervised Learning (NeurIPS'19) [`dassl/engine/ssl/mixmatch.py`]
  - Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results (NeurIPS'17) [`dassl/engine/ssl/mean_teacher.py`]
  - Semi-supervised Learning by Entropy Minimization (NeurIPS'04) [`dassl/engine/ssl/entmin.py`]
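The oldest method in the list, entropy minimization, simply penalizes uncertain predictions on unlabeled data. A torch-free sketch of the loss (illustrative only; Dassl's version in `dassl/engine/ssl/entmin.py` works on logits with PyTorch ops):

```python
import math


def entropy_min_loss(probs):
    """Mean Shannon entropy of a batch of predicted class distributions.

    probs: list of rows, each row a probability distribution over classes.
    Minimizing this pushes predictions on unlabeled data toward confident
    (low-entropy, near one-hot) distributions.
    """
    eps = 1e-12  # avoid log(0)
    total = 0.0
    for row in probs:
        total += -sum(p * math.log(p + eps) for p in row)
    return total / len(probs)
```

A confident one-hot prediction contributes (near) zero loss, while a uniform prediction over `k` classes contributes the maximum, `log(k)`.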
Feel free to make a PR to add your methods here to make it easier for others to benchmark!
Dassl supports the following datasets:
- Domain adaptation
- Domain generalization
- Semi-supervised learning
## Get started

### Installation
Make sure conda is installed properly.
```bash
# Clone this repo
git clone https://github.com/
```
