12 repositories found
wu-dd / Advances In Partial And Complementary Label Learning: A curated list of the most recent papers & code in learning with partial/complementary labels
NJUyued / MutexMatch4SSL: "MutexMatch: Semi-Supervised Learning with Mutex-Based Consistency Regularization" by Yue Duan (TNNLS)
takashiishida / Comp: [NeurIPS 2017] [ICML 2019] Code for complementary-label learning
ntucllab / Libcll: Complementary-label learning in PyTorch
Robbie-Xu / CPSD: PyTorch implementation of "Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation" (IJCAI 2022)
ntucllab / CLImage Dataset: The dataset repository for the paper "CLCIFAR: CIFAR-Derived Benchmark Datasets with Human Annotated Complementary Labels"
RoyalSkye / ATCL: [NeurIPS 2022] "Adversarial Training with Complementary Labels: On the Benefit of Gradually Informative Attacks"
wwangwitsel / SCARCE: [ICML 2024] "Learning with Complementary Labels Revisited: The Selected-Completely-at-Random Setting Is More Practical"
xhweei / Hypergraph Learning Based Discriminative Band Selection: For hyperspectral images (HSIs), selecting discriminative bands is challenging due to the scarcity of labeled samples and complex noise. This article presents a local-view-assisted discriminative band selection method with hypergraph autolearning (LvaHAl) that tackles these problems from both local and global perspectives. Specifically, the whole band space is first randomly divided into several subspaces (local views, LVs) of different dimensions, where each LV denotes a set of low-dimensional representations of the training samples built from the bands associated with it. Then, for each LV, a hinge loss function that is robust to isolated pixels and regularized by row-sparsity measures the importance of the corresponding bands. To simultaneously reduce the bias of the LVs and encode the complementary information between them, samples from all LVs are further projected into the label space. Subsequently, a hypergraph model that automatically learns the hyperedge weights is presented; this preserves the local manifold structure of the projections, ensuring that samples of the same class remain close. Finally, a consensus matrix integrates the band importances from the different LVs, yielding the optimal selection of bands from a global perspective. Classification experiments on three HSI datasets show that the method is competitive with the comparison methods.
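The band-importance step described above (a hinge loss regularized by row-sparsity) can be sketched as follows. This is a minimal illustration only: `band_selection_objective`, the binary-label setup, and the shapes are my assumptions, not the paper's exact formulation.

```python
import numpy as np

def band_selection_objective(X, y, W, lam):
    """Illustrative band-selection objective: hinge loss plus an l2,1
    (row-sparsity) penalty on W. Rows of W correspond to bands; rows
    with large l2 norm mark important bands, and the penalty drives
    uninformative rows toward zero.

    X: (n_samples, n_bands), y: (n_samples,) in {-1, +1},
    W: (n_bands, n_outputs), lam: regularization weight.
    """
    # Hinge loss, averaged over samples and outputs.
    margins = 1.0 - y[:, None] * (X @ W)
    hinge = np.maximum(0.0, margins).mean()
    # l2,1 norm: sum of the l2 norms of the rows of W.
    row_sparsity = np.sqrt((W ** 2).sum(axis=1) + 1e-12).sum()
    return hinge + lam * row_sparsity
```

After minimizing an objective of this shape per local view, the paper's consensus step would then aggregate the per-view row norms into a single global band ranking.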
gaoyi439 / Complementary Label Learning: [ICML 2021] "Discriminative Complementary-Label Learning with Weighted Loss"
gaoyi439 / GDF: [IJCAI 2023] "Unbiased Risk Estimator to Multi-Labeled Complementary Label Learning"
yhli-ml / PLNL: Official PyTorch implementation of the ICLR 2025 paper "Complementary Label Learning with Positive Label Guessing and Negative Label Enhancement"
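The common thread across these repositories is that a complementary label names a class the sample does NOT belong to, so training pushes that class's probability down instead of pushing a true class's probability up. A minimal NumPy sketch of this negative-learning style loss (function names are illustrative; real implementations, such as the unbiased risk estimators in the repos above, are more involved):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def complementary_nll(logits, comp_labels):
    """Negative-learning loss: minimize the probability assigned to the
    complementary label, i.e. -log(1 - p(comp_label | x)), averaged
    over the batch.

    logits: (n_samples, n_classes), comp_labels: (n_samples,) ints.
    """
    p = softmax(logits)
    p_comp = p[np.arange(len(comp_labels)), comp_labels]
    return -np.log(1.0 - p_comp + 1e-12).mean()
```

The loss is small when the model already assigns little mass to the complementary class and large when it confidently predicts it, which is exactly the supervision signal a complementary label carries.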