PyCIL: A Python Toolbox for Class-Incremental Learning
<p align="center"> <a href="#Introduction">Introduction</a> • <a href="#Methods-Reproduced">Methods Reproduced</a> • <a href="#Reproduced-Results">Reproduced Results</a> • <a href="#how-to-use">How To Use</a> • <a href="#license">License</a> • <a href="#Acknowledgments">Acknowledgments</a> • <a href="#Contact">Contact</a> </p> <div align="center"> <img src="./resources/logo_v2.png" width="800px"> </div>
Welcome to PyCIL, perhaps the most comprehensive toolbox for class-incremental learning in terms of implemented methods. This is the code repository for "PyCIL: A Python Toolbox for Class-Incremental Learning" [paper], implemented in PyTorch. If you use any content of this repo in your work, please cite the following bib entries:
@article{zhou2023pycil,
author = {Da-Wei Zhou and Fu-Yun Wang and Han-Jia Ye and De-Chuan Zhan},
title = {PyCIL: a Python toolbox for class-incremental learning},
journal = {SCIENCE CHINA Information Sciences},
year = {2023},
volume = {66},
number = {9},
pages = {197101},
doi = {10.1007/s11432-022-3600-y}
}
@article{zhou2024class,
author = {Zhou, Da-Wei and Wang, Qi-Wei and Qi, Zhi-Hong and Ye, Han-Jia and Zhan, De-Chuan and Liu, Ziwei},
title = {Class-Incremental Learning: A Survey},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
volume={46},
number={12},
pages={9851--9873},
year = {2024}
}
@inproceedings{zhou2024continual,
title={Continual learning with pre-trained models: A survey},
author={Zhou, Da-Wei and Sun, Hai-Long and Ning, Jingyi and Ye, Han-Jia and Zhan, De-Chuan},
booktitle={IJCAI},
pages={8363--8371},
year={2024}
}
What's New
- [2026-01]🌟 We have released C3Box, a CLIP-based Class-Incremental Learning Toolbox. Have a try!
- [2025-07]🌟 Check out our latest work on class-incremental learning with CLIP (ICCV 2025)!
- [2025-07]🌟 Check out our latest work on pre-trained model-based class-incremental learning (ICCV 2025)!
- [2025-07]🌟 Check out our latest work on domain-incremental learning with PTM (ICML 2025)!
- [2025-04]🌟 Add TagFex. State-of-the-art method of 2025!
- [2025-03]🌟 Check out our latest work on class-incremental learning (CVPR 2025)!
- [2025-02]🌟 Check out our latest work on pre-trained model-based domain-incremental learning (CVPR 2025)!
- [2025-02]🌟 Check out our latest work on class-incremental learning with vision-language models (TPAMI 2025)!
- [2024-12]🌟 Check out our latest work on pre-trained model-based class-incremental learning (AAAI 2025)!
- [2024-08]🌟 Check out our latest work on pre-trained model-based class-incremental learning (IJCV 2024)!
- [2024-07]🌟 Check out our rigorous and unified survey about class-incremental learning, which introduces some memory-agnostic measures with holistic evaluations from multiple aspects (TPAMI 2024)!
- [2024-06]🌟 Check out our work about all-layer margin in class-incremental learning (ICML 2024)!
- [2024-03]🌟 Check out our latest work on pre-trained model-based class-incremental learning (CVPR 2024)!
- [2024-01]🌟 Check out our latest survey on pre-trained model-based continual learning (IJCAI 2024)!
- [2023-09]🌟 We have released PILOT toolbox for class-incremental learning with pre-trained models. Have a try!
- [2023-07]🌟 Add MEMO, BEEF, and SimpleCIL. State-of-the-art methods of 2023!
- [2022-12]🌟 Add FeTrIL, PASS, IL2A, and SSRE.
- [2022-10]🌟 PyCIL has been published in SCIENCE CHINA Information Sciences. Check out the official introduction!
- [2022-08]🌟 Add RMM.
- [2022-07]🌟 Add FOSTER. State-of-the-art method with a single backbone!
- [2021-12]🌟 Call For Feedback: We add a <a href="#Awesome-Papers-using-PyCIL">section</a> to introduce awesome works using PyCIL. If you are using PyCIL to publish your work in top-tier conferences/journals, feel free to contact us for details!
- [2021-12]🌟 Because team members are committed to other projects and code reviews demand significant time, we will prioritize reviewing algorithms whose publications explicitly cite and implement methods from our toolbox paper. Please read the PR policy before submitting your code.
Introduction
Traditional machine learning systems are deployed under the closed-world setting, which assumes the entire training data is available before offline training. However, real-world applications often face incoming new classes, and a model should incorporate them continually. This learning paradigm is called Class-Incremental Learning (CIL). We propose a Python toolbox that implements several key algorithms for class-incremental learning to ease the burden of researchers in the machine learning community. The toolbox contains implementations of a number of founding works of CIL, such as EWC and iCaRL, and also provides current state-of-the-art algorithms that can be used for novel fundamental research. This toolbox, named PyCIL for Python Class-Incremental Learning, is open source under the MIT license.
For more information about incremental learning, you can refer to these reading materials:
- A brief introduction (in Chinese) about CIL is available here.
- A PyTorch Tutorial to Class-Incremental Learning (with explicit codes and detailed explanations) is available here.
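The class-incremental protocol described above can be sketched in a few lines: classes arrive in disjoint tasks, the model trains on each new task, and evaluation covers all classes seen so far. This is a minimal illustration of the data split, not PyCIL's actual API; the 10-class/2-per-task split is an assumed example.

```python
# Minimal sketch of the class-incremental protocol: disjoint class
# splits arrive one task at a time, and evaluation always covers
# every class seen so far. Illustrative only, not PyCIL's interface.

def make_class_splits(num_classes, classes_per_task):
    """Partition class indices into disjoint incremental tasks."""
    return [
        list(range(start, start + classes_per_task))
        for start in range(0, num_classes, classes_per_task)
    ]

def incremental_schedule(num_classes, classes_per_task):
    """Yield (new_classes, seen_classes) per incremental step:
    the model trains on new_classes but is evaluated on seen_classes."""
    seen = []
    for task in make_class_splits(num_classes, classes_per_task):
        seen = seen + task
        yield task, list(seen)

for step, (new, seen) in enumerate(incremental_schedule(10, 2)):
    print(f"step {step}: train on {new}, evaluate on {len(seen)} classes")
```

The key difference from ordinary training is the widening evaluation set: accuracy at each step is measured over all seen classes, which is what exposes catastrophic forgetting.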
Methods Reproduced
- FineTune: Baseline method which simply updates parameters on new tasks.
- EWC: Overcoming catastrophic forgetting in neural networks. PNAS2017 [paper]
- LwF: Learning without Forgetting. ECCV2016 [paper]
- Replay: Baseline method with exemplar replay.
- GEM: Gradient Episodic Memory for Continual Learning. NIPS2017 [paper]
- iCaRL: Incremental Classifier and Representation Learning. CVPR2017 [paper]
- BiC: Large Scale Incremental Learning. CVPR2019 [paper]
- WA: Maintaining Discrimination and Fairness in Class Incremental Learning. CVPR2020 [paper]
- PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning. ECCV2020 [paper]
- DER: Dynamically Expandable Representation for Class Incremental Learning. CVPR2021 [paper]
- PASS: Prototype Augmentation and Self-Supervision for Incremental Learning. CVPR2021 [paper]
- RMM: Reinforced Memory Management for Class-Incremental Learning. NeurIPS2021 [paper]
- IL2A: Class-Incremental Learning via Dual Augmentation. NeurIPS2021 [paper]
- ACIL: Analytic Class-Incremental Learning with Absolute Memorization and Privacy Protection. NeurIPS 2022 [paper]
- SSRE: Self-Sustaining Representation Expansion for Non-Exemplar Class-Incremental Learning. CVPR2022 [paper]
- FeTrIL: Feature Translation for Exemplar-Free Class-Incremental Learning. WACV2023 [paper]
- Coil: Co-Transport for Class-Incremental Learning. ACM MM2021 [paper]
- FOSTER: Feature Boosting and Compression for Class-incremental Learning. ECCV 2022 [paper]
- MEMO: A Model or 603 Exemplars: Towards Memory-Efficient Class-Incremental Learning. ICLR 2023 Spotlight [paper]
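The simplest rehearsal-based method in the list, the Replay baseline, keeps a small class-balanced exemplar memory and mixes it into each new task's training data. The toy sketch below illustrates that idea only; the memory size, class labels, and helper names are assumptions for illustration, not PyCIL's actual interface.

```python
# Toy sketch of the Replay baseline: a class-balanced exemplar memory
# whose contents are concatenated with the next task's training data.
# Sizes and names are illustrative assumptions, not PyCIL's API.
import random

class ExemplarMemory:
    def __init__(self, per_class=20, seed=0):
        self.per_class = per_class
        self.rng = random.Random(seed)
        self.buffer = {}  # class id -> list of stored samples

    def update(self, samples_by_class):
        """After finishing a task, store up to `per_class` randomly
        chosen exemplars for each newly learned class."""
        for cls, samples in samples_by_class.items():
            k = min(self.per_class, len(samples))
            self.buffer[cls] = self.rng.sample(samples, k)

    def replay_set(self):
        """Flatten the memory into (sample, label) pairs that get
        mixed into the next task's training set."""
        return [(x, cls) for cls, xs in self.buffer.items() for x in xs]

memory = ExemplarMemory(per_class=2)
memory.update({0: ["a0", "a1", "a2"], 1: ["b0", "b1"]})
# Next task trains on its own data plus the replayed exemplars.
train_data = [("c0", 2), ("c1", 2)] + memory.replay_set()
```

More sophisticated methods in the list differ mainly in how they select exemplars (e.g. iCaRL's herding) and in what regularization or distillation loss accompanies the replayed data.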