DKT
Official implementation of "Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning", ACL 2022 main conference
IND pretrained models
To help developers quickly reproduce our results, we provide our IND pre-trained model. Download it from the link below and place it in the ./pretrain_models directory.
Link: https://pan.baidu.com/s/14LwMTnebJjvTgKQhYwycCg Extraction code: 0fj1
Usage
Run the experiments by:
sh scripts/run.sh
You can change the parameters in the script.
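As a rough sketch of what editing the script looks like: the entry-point name (main.py) and every flag shown below are illustrative assumptions, not the repository's actual parameters — check scripts/run.sh in the repo for the real ones.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of scripts/run.sh; adjust the real flags in the repo's script.
python main.py \
    --dataset banking \                  # assumed flag: which intent benchmark to use
    --pretrain_dir ./pretrain_models \   # assumed flag: where the IND pre-trained model was placed
    --seed 42                            # assumed flag: random seed for reproducibility
```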
