ActTensor

ActTensor: Activation Functions for TensorFlow. https://pypi.org/project/ActTensor-tf/ Authors: Pouya Ardehkhani, Pegah Ardehkhani

Install / Use

/learn @pouyaardehkhani/ActTensor

README

<div align="center"> <img src="https://github.com/pouyaardehkhani/ActTensor/raw/master/images/ActTensor%20logo.png"><br> </div>

ActTensor: Activation Functions for TensorFlow

What is it?

ActTensor is a Python package that provides state-of-the-art activation functions which facilitate using them in Deep Learning projects in an easy and fast manner.

Why not use tf.keras.activations?

As you may know, TensorFlow defines only a handful of activation functions and, most importantly, does not include newly introduced ones. Writing another one yourself takes time and energy; this package provides most of the widely used, and even state-of-the-art, activation functions ready to use in your models.

Requirements

Install the required dependencies by running the following command:

  • conda env create -f environment.yml

Where to get it?

The source code is currently hosted on GitHub at: https://github.com/pouyaardehkhani/ActTensor

Binary installers for the latest released version are available at the Python Package Index (PyPI).

# PyPI
pip install ActTensor-tf

License

MIT

How to use?

import tensorflow as tf
import numpy as np
from ActTensor_tf import ReLU # name of the layer

Functional API

inputs = tf.keras.layers.Input(shape=(28, 28))
x = tf.keras.layers.Flatten()(inputs)
x = tf.keras.layers.Dense(128)(x)
# the activation class you want
x = ReLU()(x)
output = tf.keras.layers.Dense(10, activation='softmax')(x)

model = tf.keras.models.Model(inputs=inputs, outputs=output)

Sequential API

model = tf.keras.models.Sequential([tf.keras.layers.Flatten(),
                                    tf.keras.layers.Dense(128),
                                    # the activation class you want
                                    ReLU(),
                                    tf.keras.layers.Dense(10, activation=tf.nn.softmax)])
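For intuition, each of these activation layers boils down to a stateless callable that applies an elementwise function to its input. A minimal NumPy sketch of that pattern (illustrative only, not ActTensor's actual implementation):

```python
import numpy as np

class ReLULayer:
    """Illustrative stand-in for an activation layer: a stateless
    callable applying an elementwise function to its input."""
    def __call__(self, x):
        # ReLU: max(0, x), applied elementwise
        return np.maximum(0.0, x)

layer = ReLULayer()
out = layer(np.array([-2.0, -0.5, 0.0, 1.5]))
# out -> [0.0, 0.0, 0.0, 1.5]
```

In Keras, the real layers additionally subclass tf.keras.layers.Layer so they slot into functional and sequential models, but the elementwise mapping is the core of what they do.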

NOTE:

The plain functions behind the activation layers are also available, though some are exported under different names; see the table of classes and functions below for the exact names.

from ActTensor_tf import relu

Activations

Classes and Functions are available in ActTensor_tf

| Activation Name | Class Name | Function Name |
|:---:|:---:|:---:|
| SoftShrink | SoftShrink | softSHRINK |
| HardShrink | HardShrink | hard_shrink |
| GLU | GLU | - |
| Bilinear | Bilinear | - |
| ReGLU | ReGLU | - |
| GeGLU | GeGLU | - |
| SwiGLU | SwiGLU | - |
| SeGLU | SeGLU | - |
| ReLU | ReLU | relu |
| Identity | Identity | identity |
| Step | Step | step |
| Sigmoid | Sigmoid | sigmoid |
| HardSigmoid | HardSigmoid | hard_sigmoid |
| LogSigmoid | LogSigmoid | log_sigmoid |
| SiLU | SiLU | silu |
| PLinear | ParametricLinear | parametric_linear |
| Piecewise-Linear | PiecewiseLinear | piecewise_linear |
| Complementary Log-Log | CLL | cll |
| Bipolar | Bipolar | bipolar |
| Bipolar-Sigmoid | BipolarSigmoid | bipolar_sigmoid |
| Tanh | Tanh | tanh |
| TanhShrink | TanhShrink | tanhshrink |
| LeCun's Tanh | LeCunTanh | leCun_tanh |
| HardTanh | HardTanh | hard_tanh |
| TanhExp | TanhExp | tanh_exp |
| Absolute | ABS | Abs |
| Squared-ReLU | SquaredReLU | squared_relu |
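To make a few of the less common entries above concrete, here are plain-NumPy versions of their standard formulas from the literature (SoftShrink, TanhExp, Squared-ReLU). These are illustrative sketches, not ActTensor's own code, and the `lam` threshold parameter is an assumption on my part:

```python
import numpy as np

def soft_shrink(x, lam=0.5):
    # SoftShrink: x - lam if x > lam, x + lam if x < -lam, else 0
    return np.where(x > lam, x - lam, np.where(x < -lam, x + lam, 0.0))

def tanh_exp(x):
    # TanhExp (Liu & Di, 2020): x * tanh(exp(x))
    return x * np.tanh(np.exp(x))

def squared_relu(x):
    # Squared ReLU: max(0, x) ** 2
    return np.maximum(0.0, x) ** 2

x = np.array([-1.0, 0.0, 0.25, 2.0])
print(soft_shrink(x))   # -> [-0.5, 0.0, 0.0, 1.5]
print(squared_relu(x))  # -> [0.0, 0.0, 0.0625, 4.0]
```

The ActTensor versions operate on TensorFlow tensors rather than NumPy arrays, but the elementwise math is the same.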
