FastActivations.jl

A collection of activation function approximations for Flux.

In some models, the precision of the sigmoid and tanh functions can be reduced without a loss of accuracy in the training process. Switching to an approximation can significantly reduce training time for some models.

Sigmoid Approximations

For sigmoid we provide fitted approximations using Taylor and Padé curve-fit models, as well as an implementation that uses a fast exp implementation based on the limit definition: exp(x) = lim<sub>n->inf</sub> (1 + x/n)<sup>n</sup>
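To illustrate the idea, here is a minimal sketch of a fast-exp sigmoid (the function names are hypothetical, not the package's actual API). Choosing n = 2<sup>k</sup> means (1 + x/n)<sup>n</sup> can be evaluated with just k squarings:

```julia
# Sketch only: approximate exp(x) ≈ (1 + x/n)^n with n = 2^k,
# evaluated by k repeated squarings instead of a call to exp.
function fastexp(x::Float64, k::Int=10)
    n = 2.0^k
    y = 1.0 + x / n
    for _ in 1:k
        y *= y          # after k squarings, y = (1 + x/n)^(2^k)
    end
    return y
end

# Hypothetical fast sigmoid built on the approximation above.
fastsigmoid(x) = 1.0 / (1.0 + fastexp(-x))

# Rough sanity check against the exact sigmoid at x = 1
@assert abs(fastsigmoid(1.0) - 1 / (1 + exp(-1))) < 1e-2
```

The relative error of the approximation shrinks roughly like x²/(2n), so increasing k trades a few extra multiplications for accuracy.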

(Accuracy plots: fitted sigmoid approximations and the fast-exp variant.)

Theano Sigmoid

There is also an implementation of TheanoFastSigmoid, which is currently accepted in the Theano project. It is included mostly for comparison, as it is both slower and less accurate than the other sigmoid approximations.

Tanh Approximations

For tanh we provide fitted approximations using Taylor and Padé curve-fit models, as well as an implementation based on the continued-fraction expansion of tanh.
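For reference, Lambert's continued fraction for tanh is tanh(x) = x / (1 + x² / (3 + x² / (5 + …))). A minimal sketch of a truncated version (the name `cftanh` is illustrative, not the package's API):

```julia
# Sketch only: tanh via Lambert's continued fraction, truncated
# after the "7" term:
#   tanh(x) ≈ x / (1 + x² / (3 + x² / (5 + x²/7)))
function cftanh(x)
    x2 = x * x
    return x / (1 + x2 / (3 + x2 / (5 + x2 / 7)))
end

@assert abs(cftanh(0.5) - tanh(0.5)) < 1e-5
```

Deeper truncations give more accuracy at the cost of an extra division per level; near the origin even this shallow cut is very close to tanh.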


We also provide the serpentine function.

(Accuracy plots: fitted, continued-fraction, and serpentine tanh approximations.)
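The serpentine curve has the form y = a·b·x / (x² + a²). As a sketch of how it can stand in for tanh (the parameter choice a = b = 2 below is illustrative, chosen only so the slope at the origin matches tanh, and is not necessarily what the package uses):

```julia
# Sketch only: serpentine curve y = a*b*x / (x^2 + a^2).
# With a = b = 2 the derivative at 0 is b/a = 1, matching tanh'(0).
serpentine(x; a=2.0, b=2.0) = a * b * x / (x^2 + a^2)

# Near the origin the serpentine tracks tanh closely
@assert abs(serpentine(0.25) - tanh(0.25)) < 0.01
```

Unlike tanh, this rational form needs no exponentials, only a multiply, an add, and a divide, which is where the speed-up comes from.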
