Keraflow
Deep Learning for Python.
Activations

Usage of Activations

An activation function adds non-linearity to a layer's output. To specify a layer's activation function, you can either append an Activation layer after the layer:

model = Sequential([Input(64), Dense(2), Activation('relu')])

or through the activation argument of the layer:

model = Sequential([Input(64), Dense(2, activation='relu')])

You can also pass a Theano/TensorFlow function to the activation argument:

from keraflow import backend as B

def truncated_relu(x):
    return B.relu(x, max_value=20)

model = Sequential([Input(64), Dense(2, activation=truncated_relu)])

or utilize an existing Keraflow activation function:

from keraflow.activations import relu

def truncated_relu(x):
    return relu(x, max_value=20)

model = Sequential([Input(64), Dense(2, activation=truncated_relu)])
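To see what the truncated relu above computes, here is a minimal NumPy sketch. It assumes Keraflow's relu follows the usual semantics (as in Keras): negative inputs clip to 0, and outputs are capped at max_value; the function name and values are illustrative only.

```python
import numpy as np

def truncated_relu(x, max_value=20):
    # Clip negatives to 0, then cap at max_value,
    # assuming relu(x, max_value) == min(max(x, 0), max_value).
    return np.minimum(np.maximum(x, 0), max_value)

# -5 clips to 0, 3 passes through, 25 is capped at 20.
print(truncated_relu(np.array([-5.0, 3.0, 25.0])))
```

The cap keeps large pre-activations from saturating downstream layers, which is the usual motivation for passing max_value.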

Available Activations