This package provides an implementation of Rational Activation Functions for the machine learning framework TensorFlow. It is built on the Keras API, which is fully integrated into TensorFlow.
In TensorFlow, you can instantiate a Rational Activation Function by running:
from rational.keras import Rational
my_fun = Rational()
This instantiates a Keras Layer.
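As a minimal sketch (assuming, as with any Keras layer, that the instance can be called directly on a float tensor; the example values are arbitrary), the function can be evaluated like this:
import tensorflow as tf
from rational.keras import Rational
my_fun = Rational()
# calling the layer applies the rational function element-wise to the input
x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])
y = my_fun(x)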
If you wish to customize your Rational instance, feel free to play around with its parameters, e.g. approx_func, which selects the function the rational activation is initialized to approximate:
from rational.keras import Rational
my_custom_fun = Rational(approx_func='tanh')
There are two ways to integrate a Rational instance into a neural network.
You can pass your Rational instance as the activation parameter of a layer. In contrast to most other activation functions, it will of course be learnable.
import tensorflow as tf
from rational.keras import Rational
my_fun = Rational()
model = tf.keras.Sequential([
# layers ...
# rational activation function as a parameter of some other layer (here: Dense)
tf.keras.layers.Dense(100, activation=my_fun),
# more layers ...
])
You can add your Rational instance as an additional learnable layer to a network.
import tensorflow as tf
from rational.keras import Rational
my_fun = Rational()
model = tf.keras.Sequential([
# layers ...
# rational activation function as layer
my_fun,
# more layers ...
])
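In either case, the coefficients of the rational function are trainable weights of the model and are optimized together with the other layers. Below is a minimal training sketch; the layer sizes, optimizer, loss and the x_train / y_train data are placeholder assumptions, not part of the package:
import tensorflow as tf
from rational.keras import Rational
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, input_shape=(784,)),
    Rational(),  # learnable rational activation as its own layer
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
# model.fit(x_train, y_train, epochs=5)  # x_train / y_train: your training data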
Please find more documentation on ReadTheDocs.