API - Activations

To keep TensorLayer simple, we minimize the number of activation functions as much as we can, and we encourage you to use TensorFlow's built-in functions instead. TensorFlow provides tf.nn.relu, tf.nn.relu6, tf.nn.elu, tf.nn.softplus, tf.nn.softsign and so on. More official TensorFlow activation functions can be found in the TensorFlow documentation. For parametric activations, please read the layer APIs.
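
For example, a TensorFlow activation can be passed to a layer through its act argument (a minimal sketch; the placeholder shape and layer names are illustrative):

>>> import tensorflow as tf
>>> import tensorlayer as tl
>>> x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
>>> network = tl.layers.InputLayer(x, name='input')
>>> network = tl.layers.DenseLayer(network, n_units=100, act=tf.nn.relu, name='dense_relu')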

The shortcut of tensorlayer.activation is tensorlayer.act.

Your activation

Customizing an activation function in TensorLayer is very easy. The following example implements an activation that multiplies its input by 2. For more complex activations, the TensorFlow API will be required.

def double_activation(x):
    # Scale the input tensor by a factor of 2.
    return x * 2
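
Once defined, the custom activation can be passed to a layer through its act argument, just like a built-in one (a minimal usage sketch, assuming network is an existing TensorLayer layer; the layer name is illustrative):

>>> network = tl.layers.DenseLayer(network, n_units=100, name='dense_double',
...                 act=double_activation)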

identity(x[, name])                  The identity activation function. Shortcut is linear.
ramp([x, v_min, v_max, name])        The ramp activation function.
leaky_relu([x, alpha, name])         The LeakyReLU. Shortcut is lrelu.
pixel_wise_softmax(output[, name])   Return the softmax outputs of images; every pixel has multiple labels, and the values of each pixel sum to 1.

Identity

tensorlayer.activation.identity(x, name=None)[source]

The identity activation function. Shortcut is linear.

Parameters:
x : a tensor input

The input tensor(s).

Returns:
A `Tensor` with the same type as `x`.
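
For example, identity can be used when a layer should apply no nonlinearity, such as a linear output layer (a minimal usage sketch, assuming network is an existing TensorLayer layer; the layer name is illustrative):

>>> network = tl.layers.DenseLayer(network, n_units=1, act=tl.activation.identity, name='output_linear')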

Ramp

tensorlayer.activation.ramp(x=None, v_min=0, v_max=1, name=None)[source]

The ramp activation function.

Parameters:
x : a tensor input

The input tensor(s).

v_min : float

Input values smaller than v_min are set to v_min.

v_max : float

Input values greater than v_max are set to v_max.

name : a string or None

An optional name to attach to this activation function.

Returns:
A `Tensor` with the same type as `x`.
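
The ramp function clips its input to the range [v_min, v_max]. A behaviourally equivalent sketch using tf.clip_by_value (an illustration, not necessarily the actual implementation):

>>> import tensorflow as tf
>>> def ramp_sketch(x, v_min=0, v_max=1, name=None):
...     # Values below v_min are raised to v_min; values above v_max are lowered to v_max.
...     return tf.clip_by_value(x, clip_value_min=v_min, clip_value_max=v_max, name=name)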

Leaky ReLU

tensorlayer.activation.leaky_relu(x=None, alpha=0.1, name='LeakyReLU')[source]

The LeakyReLU. Shortcut is lrelu.

Modified version of ReLU, introducing a nonzero gradient for negative input.

Parameters:
x : A Tensor with type float, double, int32, int64, uint8, int16, or int8.

alpha : float

The slope for negative input.

name : a string or None

An optional name to attach to this activation function.

Examples

>>> network = tl.layers.DenseLayer(network, n_units=100, name='dense_lrelu',
...                 act=lambda x: tl.act.lrelu(x, 0.2))
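
Mathematically, the LeakyReLU returns x for non-negative input and alpha * x otherwise. An equivalent sketch (an illustration, not necessarily the actual implementation):

>>> import tensorflow as tf
>>> def leaky_relu_sketch(x, alpha=0.1):
...     # For 0 < alpha < 1, max(x, alpha * x) equals x when x >= 0 and alpha * x when x < 0.
...     return tf.maximum(x, alpha * x)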

Pixel-wise Softmax

tensorlayer.activation.pixel_wise_softmax(output, name='pixel_wise_softmax')[source]

Return the softmax outputs of images; every pixel has multiple labels, and the values of each pixel sum to 1. Usually used for image segmentation.

Parameters:
output : tensor
  • For a 2D image, a 4D tensor [batch_size, height, width, channel], channel >= 2.
  • For a 3D image, a 5D tensor [batch_size, depth, height, width, channel], channel >= 2.

Examples

>>> outputs = pixel_wise_softmax(network.outputs)
>>> dice_loss = 1 - dice_coe(outputs, y_, epsilon=1e-5)
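
Conceptually, pixel_wise_softmax applies a softmax over the channel axis, so that the channel values at each pixel sum to 1. A behaviourally equivalent sketch (an illustration, not necessarily the actual implementation):

>>> import tensorflow as tf
>>> def pixel_wise_softmax_sketch(output):
...     # tf.nn.softmax normalizes over the last axis, i.e. the channel dimension here.
...     return tf.nn.softmax(output)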

Parametric activation

See tensorlayer.layers.