1,941 skills found · Page 1 of 65
taki0112 / UGATIT · Official TensorFlow implementation of U-GAT-IT: Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation (ICLR 2020)
weiaicunzai / Pytorch Cifar100 · Practice on CIFAR-100 (ResNet, DenseNet, VGG, GoogLeNet, Inception-v3, Inception-v4, Inception-ResNet-v2, Xception, ResNet in ResNet, ResNeXt, ShuffleNet, ShuffleNet-v2, MobileNet, MobileNet-v2, SqueezeNet, NASNet, Residual Attention Network, SENet, WideResNet)
PetarV- / GAT · Graph Attention Networks (https://arxiv.org/abs/1710.10903)
lucidrains / Musiclm Pytorch · Implementation of MusicLM, Google's new SOTA model for music generation using attention networks, in PyTorch
zhanghang1989 / ResNeSt · ResNeSt: Split-Attention Networks
diegoantognini / PyGAT · PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
linto-ai / Whisper Timestamped · Multilingual automatic speech recognition with word-level timestamps and confidence
heykeetae / Self Attention GAN · PyTorch implementation of Self-Attention Generative Adversarial Networks (SAGAN)
znxlwm / UGATIT Pytorch · Official PyTorch implementation of U-GAT-IT: Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation
CASIA-IVA-Lab / DANet · Dual Attention Network for Scene Segmentation (CVPR 2019)
ozan-oktay / Attention Gated Networks · Use of attention gates in a convolutional neural network for medical image classification and segmentation
hila-chefer / Transformer Explainability · [CVPR 2021] Official PyTorch implementation of "Transformer Interpretability Beyond Attention Visualization", a novel method to visualize classifications by Transformer-based networks.
allenai / Bi Att Flow · The Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization.
yulunzhang / RCAN · PyTorch code for our ECCV 2018 paper "Image Super-Resolution Using Very Deep Residual Channel Attention Networks"
szagoruyko / Attention Transfer · Improving convolutional networks via attention transfer (ICLR 2017)
BangguWu / ECANet · Code for "ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks"
xiangwang1223 / Knowledge Graph Attention Network · KGAT: Knowledge Graph Attention Network for Recommendation (KDD 2019)
cedrickchee / Awesome Transformer Nlp · A curated list of NLP resources focused on Transformer networks, attention mechanisms, GPT, BERT, ChatGPT, LLMs, and transfer learning.
richliao / TextClassifier · Text classifier implementing Hierarchical Attention Networks for document classification
lucidrains / Tab Transformer Pytorch · Implementation of TabTransformer, an attention network for tabular data, in PyTorch
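Two of the entries above (GAT and PyGAT) implement the graph attention mechanism of Veličković et al. (https://arxiv.org/abs/1710.10903). A minimal single-head NumPy sketch of that layer follows; it is illustrative only and is not taken from either repository (all names and shapes here are my own, and real implementations vectorize this and add multi-head concatenation and dropout):

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    # LeakyReLU with the negative slope used in the GAT paper
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_layer(H, A, W, a):
    """Single graph-attention head (sketch).

    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, F') shared linear map; a: (2*F',) attention vector.
    """
    Wh = H @ W                          # project all node features: (N, F')
    out = np.zeros_like(Wh)
    for i in range(Wh.shape[0]):
        nbrs = np.nonzero(A[i])[0]      # neighbourhood of node i
        # e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) for each neighbour j
        e = leaky_relu(np.array(
            [a @ np.concatenate([Wh[i], Wh[j]]) for j in nbrs]))
        alpha = softmax(e)              # normalize over the neighbourhood
        out[i] = alpha @ Wh[nbrs]       # attention-weighted aggregation
    return out

# Toy example: 4 fully connected nodes, 3 input features, 2 output features
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
A = np.ones((4, 4))
W = rng.normal(size=(3, 2))
a = rng.normal(size=(4,))
out = gat_layer(H, A, W, a)
print(out.shape)  # (4, 2)
```

In the paper this per-node loop is expressed as a dense masked-softmax over the whole adjacency matrix, which is how the repositories above implement it on GPU.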