49 skills found · Page 1 of 2
cxdzyq1110 / NPU On FPGA: Implements an NPU compute unit on an FPGA, supporting matrix operations (ADD/ADDi/ADDs/MULT/MULTi/DOT, etc.), image-processing operations (CONV/POOL, etc.), and nonlinear mappings (RELU/TANH/SIGM, etc.).
hhj1897 / Face Parsing: Official PyTorch implementation of 'RoI Tanh-polar Transformer Network for Face Parsing in the Wild.'
rastikerdar / Tanha Font: (Discontinued) A Persian (Farsi) font named Tanha.
yandex / Fastops: This small library accelerates bulk calls of certain math functions on AVX and AVX2 hardware. Currently supported operations are exp, log, sigmoid, and tanh. The library is designed with extensibility in mind.
eskender-f / Roi Tanh: Face Parsing with RoI Tanh-Warping.
liyuxuan7762 / TanhuaAPP: Heima 'Tanhua' dating project. Tanhua is an online dating platform for meeting strangers, where you can search for people nearby and view friends' posts.
tanhauhau / Tanhauhau.github.io: Tan Li Hau's personal website.
lyfGeek / Tanhua Geek: 'Tanhua' dating app.
1995parham-teaching / TanhaDarKhaneh: Learn to code with C in your home :house: :face_with_thermometer: :mask:
sadrasabouri / CORDIC: Implementation of CORDIC algorithms using Verilog.
BadtzMaru / Heima Tanhuajiaoyou: Heima Academy react-native 'Tanhua' dating app.
sigma-py / Tanh Sinh: :triangular_ruler: tanh-sinh quadrature for Python.
aod321 / Face Parsing Via Tanh Warping: PyTorch implementations of 'Face Parsing via tanh-warping.'
HuHaigen / Adaptively Customizing Activation Functions: Activation functions are crucial for enhancing the nonlinearity of neural networks and their ability to model complex relationships between inputs and response variables. This work proposes a methodology that adaptively customizes activation functions by adding only a few parameters to traditional ones such as Sigmoid, Tanh, and ReLU. Its effectiveness in accelerating convergence and improving performance is verified through theoretical and experimental analysis, with experiments across network models (AlexNet, VGGNet, GoogLeNet, ResNet, DenseNet) and datasets (CIFAR10, CIFAR100, miniImageNet, PASCAL VOC, COCO). Further comparisons cover optimization strategies (SGD, Momentum, AdaGrad, AdaDelta, ADAM) and recognition tasks such as classification and detection. The results show that the method is very simple yet delivers significant gains in convergence speed, precision, and generalization, surpassing popular activations like ReLU and adaptive ones like Swish in almost all experiments.
nitheeshkm / Sigmoid Tanh Verilog: Verilog sigmoid and tanh functions that can be configured and added to your neural-network project.
svretina / FastTanhSinhQuadrature.jl: Fast, high-precision numerical integration using tanh-sinh (double-exponential) quadrature in Julia.
TinSn50 / PINNs Applications In Linear Elastic Solid Mechanics: This project is divided into two parts. In the first study, Lamé parameters are identified using the tanh activation function. Then, six activation functions are analysed in terms of minimum loss, training time, and convergence order for different error norms.
EPSOFT / Deep Learning: Deep Learning.
hhj1897 / Roi Tanh Warping: No description available.
alireza-shirzad / Cordic Tanh: Verilog and MATLAB implementation of tanh using the CORDIC algorithm.
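Several of the results above (sigma-py / Tanh Sinh, svretina / FastTanhSinhQuadrature.jl) implement tanh-sinh (double-exponential) quadrature. As a rough illustration of the idea, and not code from any listed repository, a minimal plain-Python sketch of the standard substitution x = tanh((π/2)·sinh(t)) followed by a trapezoidal sum looks like this:

```python
import math

def tanh_sinh_quad(f, h=0.1, n=30):
    """Approximate the integral of f over [-1, 1] with tanh-sinh quadrature.

    The substitution x = tanh((pi/2) * sinh(t)) maps the real line onto
    (-1, 1); the transformed integrand decays double-exponentially, so a
    plain trapezoidal sum over t = k*h converges extremely fast.
    Keep n*h modest (here 3.0) so cosh() does not overflow.
    """
    total = 0.0
    for k in range(-n, n + 1):
        t = k * h
        s = math.sinh(t)
        x = math.tanh(0.5 * math.pi * s)
        # weight = dx/dt = (pi/2) * cosh(t) / cosh((pi/2)*sinh(t))^2
        sech = 1.0 / math.cosh(0.5 * math.pi * s)
        w = 0.5 * math.pi * math.cosh(t) * sech * sech
        total += w * f(x)
    return h * total
```

The nodes cluster double-exponentially toward the endpoints, which is why this rule copes well with endpoint singularities; production implementations like the repositories above add adaptive step halving and careful overflow handling that this sketch omits.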
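Two other entries (sadrasabouri / CORDIC, alireza-shirzad / Cordic Tanh) compute tanh in hardware via the CORDIC algorithm. As a hedged software sketch of the general technique — not a transcription of either repository's Verilog — hyperbolic-mode CORDIC rotates the vector (1, 0) toward the target angle z; the scale factor cancels in the ratio y/x = sinh(z)/cosh(z):

```python
import math

def cordic_tanh(z, iters=30):
    """Approximate tanh(z) with rotation-mode hyperbolic CORDIC.

    Valid for |z| up to ~1.118, the convergence range of the basic scheme
    (larger arguments need range reduction first). Iterations 4, 13, 40
    are executed twice, the classic fix for hyperbolic convergence; the
    constant CORDIC gain cancels in the ratio y/x.
    """
    # Build the iteration schedule, repeating the indices 4, 13, 40.
    schedule = []
    i = 1
    while len(schedule) < iters:
        schedule.append(i)
        if i in (4, 13, 40):
            schedule.append(i)
        i += 1

    x, y = 1.0, 0.0
    for i in schedule:
        d = 1.0 if z >= 0 else -1.0   # rotate toward z = 0
        e = 2.0 ** -i                 # shift by i bits in hardware
        x, y = x + d * y * e, y + d * x * e
        z -= d * math.atanh(e)
    return y / x                      # K*sinh(z0) / K*cosh(z0)
```

In an FPGA the multiplications by 2^-i become bit shifts and the atanh values come from a small lookup table, which is what makes CORDIC attractive for the hardware implementations listed above.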