Keraflow
Deep Learning for Python.
keraflow.layers.recurrent.LSTM Class Reference

Long Short-Term Memory unit - Hochreiter 1997.

Inheritance diagram for keraflow.layers.recurrent.LSTM:
keraflow.layers.recurrent.Recurrent

Public Member Functions

def __init__
 
- Public Member Functions inherited from keraflow.layers.recurrent.Recurrent
def __init__
 

Detailed Description

Long Short-Term Memory unit - Hochreiter 1997.

For a step-by-step description of the algorithm, see this tutorial.
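The parameter names in the constructor below map directly onto the standard LSTM gate equations: `init` seeds the input-to-hidden matrices W_i, W_f, W_c, W_o; `inner_init` seeds the hidden-to-hidden matrices U_i, U_f, U_c, U_o; `forget_bias_init` seeds the forget-gate bias. A minimal NumPy sketch of one recurrence step illustrates the mapping (this is an illustration of the algorithm, not Keraflow's implementation; the dict-based W, U, b layout is an assumption for readability):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step. W/U/b are dicts keyed by gate name: 'i', 'f', 'c', 'o'.

    W holds input-to-hidden matrices (what `init` initializes),
    U holds hidden-to-hidden matrices (what `inner_init` initializes),
    b holds biases (`forget_bias_init` initializes b['f'], typically to ones).
    """
    i = sigmoid(x_t @ W['i'] + h_prev @ U['i'] + b['i'])        # input gate
    f = sigmoid(x_t @ W['f'] + h_prev @ U['f'] + b['f'])        # forget gate
    o = sigmoid(x_t @ W['o'] + h_prev @ U['o'] + b['o'])        # output gate
    c_tilde = np.tanh(x_t @ W['c'] + h_prev @ U['c'] + b['c'])  # candidate cell state
    c_t = f * c_prev + i * c_tilde                              # new cell state
    h_t = o * np.tanh(c_t)                                      # `activation` applied on output
    return h_t, c_t
```

In the real layer, `inner_activation` (default `'hard_sigmoid'`) replaces the exact sigmoid on the three gates, and `activation` (default `'tanh'`) replaces the two tanh calls.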

Constructor & Destructor Documentation

def keraflow.layers.recurrent.LSTM.__init__(
    self,
    output_dim,
    init = 'glorot_uniform',
    inner_init = 'orthogonal',
    forget_bias_init = 'one',
    activation = 'tanh',
    inner_activation = 'hard_sigmoid',
    dropout_W = 0.,
    dropout_U = 0.,
    return_sequences = False,
    go_backwards = False,
    stateful = False,
    unroll = False,
    **kwargs
)
Parameters
    output_dim: int. The output dimension of the layer.
    init: str/function. Function to initialize W_i, W_f, W_c, W_o (input to hidden transformation). See Initializations.
    inner_init: str/function. Function to initialize U_i, U_f, U_c, U_o (hidden to hidden transformation). See Initializations.
    forget_bias_init: Initialization function for the bias of the forget gate. Jozefowicz et al. recommend initializing with ones.
    activation: str/function. Activation function applied on the output. See Activations.
    inner_activation: str/function. Activation function applied on the inner cells. See Activations.
    dropout_W: float between 0 and 1. Fraction of the input units to drop for input gates.
    dropout_U: float between 0 and 1. Fraction of the input units to drop for recurrent connections.
    return_sequences: Boolean. See Recurrent.__init__.
    go_backwards: Boolean. See Recurrent.__init__.
    stateful: Boolean. See Recurrent.__init__.
    unroll: Boolean. See Recurrent.__init__.
    kwargs: See Layer.__init__.
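The default `inner_activation`, `'hard_sigmoid'`, is conventionally a piecewise-linear approximation of the sigmoid that is cheaper to compute on the three gates. A sketch assuming the common Keras-style definition (clip of a linear ramp; Keraflow's exact formula may differ):

```python
import numpy as np

def hard_sigmoid(x):
    # Piecewise-linear sigmoid approximation: 0 below -2.5,
    # 1 above 2.5, linear ramp 0.2*x + 0.5 in between.
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)
```

Because the ramp saturates exactly at 0 and 1, gates driven far open or far closed cost no exponential evaluation, which is the usual motivation for this default.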