API - Activations

To keep TensorLayer simple, we minimize the number of activation functions as much as we can, so we encourage you to use TensorFlow's functions instead. TensorFlow provides tf.nn.relu, tf.nn.relu6, tf.nn.elu, tf.nn.softplus, tf.nn.softsign and so on. More official activation functions can be found in the TensorFlow API documentation.
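
For example, a TensorFlow activation function can be passed directly to a layer through its act argument. The sketch below assumes TensorFlow 1.x and the TensorLayer 1.x layer API (InputLayer, DenseLayer); the placeholder shape and layer names are illustrative:

import tensorflow as tf
import tensorlayer as tl

# Placeholder for batches of 784-dimensional inputs (illustrative shape).
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
network = tl.layers.InputLayer(x, name='input')
# Any TensorFlow activation can be passed via the `act` argument.
network = tl.layers.DenseLayer(network, n_units=800, act=tf.nn.relu, name='dense1')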

Creating a custom activation

Implementing a custom activation function in TensorLayer is straightforward.

The following is an example implementation of an activation that multiplies its input by 2. For more complex activations, you will need the TensorFlow API.

def double_activation(x):
    return x * 2
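
The custom function can then be passed to a layer's act argument just like a built-in activation. A minimal usage sketch, assuming the TensorLayer 1.x DenseLayer API and an existing network (the layer name is illustrative):

# Apply the custom activation to a dense layer's output.
network = tl.layers.DenseLayer(network, n_units=100, act=double_activation, name='double')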

TensorLayer provides the following built-in activation functions:

identity(x)                       The identity activation function.
ramp([x, v_min, v_max, name])     The ramp activation function.

Activation functions

tensorlayer.activation.identity(x)

The identity activation function, which returns its input unchanged.

Parameters:
x : a tensor
    The input tensor.
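
As a quick reference, identity is a no-op and is mainly useful as the act argument of a layer that should stay linear. A minimal sketch, assuming the module is importable as tl.activation:

import tensorflow as tf
import tensorlayer as tl

x = tf.constant([-1.0, 0.0, 2.0])
y = tl.activation.identity(x)  # y is exactly x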

tensorlayer.activation.ramp(x=None, v_min=0, v_max=1, name=None)

The ramp activation function, which clips its input to the range [v_min, v_max].

Parameters:
x : a tensor
    The input tensor.
v_min : float
    Inputs smaller than v_min are set to v_min.
v_max : float
    Inputs greater than v_max are set to v_max.
name : a string or None
    An optional name to attach to this activation function.
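
For reference, ramp's behavior can be reproduced with tf.clip_by_value. This is a sketch of the behavior, not necessarily the library's exact implementation:

import tensorflow as tf

def ramp_sketch(x, v_min=0, v_max=1, name=None):
    # Clip the input element-wise into the range [v_min, v_max].
    return tf.clip_by_value(x, clip_value_min=v_min, clip_value_max=v_max, name=name)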