ResNeSt

PyTorch implementation of ResNeSt : Split-Attention Networks [1].

This implementation exists to deepen my own understanding of the ResNeSt architecture,
focusing mainly on the radix-major implementation of the bottleneck block.
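
As a rough sketch of what the split-attention operation does (a simplified, hypothetical rendition written for illustration, not this repository's actual code): the radix feature splits are summed, globally pooled, passed through a two-layer bottleneck, and the resulting per-split attention weights are normalized with a softmax over the radix dimension before recombining the splits.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SplitAttention(nn.Module):
    """Minimal split-attention over `radix` feature splits.

    Hypothetical simplification of the block in [1]: cardinality groups
    are omitted, and the splits are assumed to be stacked along the
    channel dimension of the input.
    """

    def __init__(self, channels: int, radix: int = 2, reduction: int = 4):
        super().__init__()
        self.radix = radix
        inter = max(channels * radix // reduction, 32)
        # 1x1 convs play the role of the two fully connected layers
        self.fc1 = nn.Conv2d(channels, inter, kernel_size=1)
        self.fc2 = nn.Conv2d(inter, channels * radix, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, radix * C, H, W) -- radix splits stacked along channels
        b, rc, h, w = x.shape
        c = rc // self.radix
        splits = x.view(b, self.radix, c, h, w)
        gap = splits.sum(dim=1)                    # fuse the splits
        gap = F.adaptive_avg_pool2d(gap, 1)        # (B, C, 1, 1) global pool
        attn = self.fc2(F.relu(self.fc1(gap)))     # (B, radix * C, 1, 1)
        attn = attn.view(b, self.radix, c)
        attn = F.softmax(attn, dim=1)              # r-SoftMax over the radix axis
        attn = attn.view(b, self.radix, c, 1, 1)
        return (splits * attn).sum(dim=1)          # (B, C, H, W)
```

For radix=1 the softmax over a single split degenerates to a sigmoid-style gate in the original paper; this sketch keeps the softmax form for brevity.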

See also the official implementation.

Requirements

  • docker
  • docker-compose
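
Given these requirements, a typical containerized workflow might look like the following. The service name `app` is an assumption for illustration; check the repository's `docker-compose.yml` for the actual service name.

```shell
# Build the image defined in docker-compose.yml
docker-compose build

# Run a one-off container; "app" is a hypothetical service name --
# substitute whatever service docker-compose.yml actually defines
docker-compose run --rm app python -c "import torch; print(torch.__version__)"
```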

Model

  • Only supports dilation=1.

ToDo

  • Evaluate the model

Reference

[1] ResNeSt: Split-Attention Networks, Hang Zhang, Chongruo Wu, Zhongyue Zhang, Yi Zhu, Zhi Zhang, Haibin Lin, Yue Sun, Tong He, Jonas Mueller, R. Manmatha, Mu Li, Alexander Smola, https://arxiv.org/abs/2004.08955

Author

Sawada Tomoya
