Keraflow
Deep Learning for Python.
Class List
Here are the classes and namespaces with brief descriptions; a short usage sketch follows the list:
keraflow
    activations - Built-in activation functions
    backend
        common
            Backend
        tensorflow_backend
            TensorflowBackend
        theano_backend
            TheanoBackend
    callbacks
        Callback - Abstract base class used to build new callbacks
        ModelCheckpoint - Save the model after every epoch
        EarlyStopping - Stop training when a monitored quantity has stopped improving
        LearningRateScheduler - Learning rate scheduler
    constraints - Built-in constraints
        MaxNorm - Constrain the weights along an axis pattern to have norm no greater than a maximum value
        NonNeg - Constrain the weights to be non-negative
        UnitNorm - Constrain the weights along an axis pattern to have unit norm
    initializations - Built-in initialization functions
    layers
        base
            Kensor - Wrapper for a backend tensor
            Layer - Building block of a model
            MultiInputLayer - Base class for layers accepting multiple inputs
            Input - The entry point of each model
            SequentialLayer - Class for using a Sequential model as a Layer
        convolution
            ConvolutionBase - Base layer for convolution layers
            Convolution1D - Convolution layer for convolving (sequence_length, input_dim) inputs
            Convolution2D - Convolution layer for convolving (input_depth, input_row, input_col) inputs
            Convolution3D - Not implemented yet
            PoolingBase - Base layer for pooling layers
            Pooling1D - Pooling layer for sub-sampling (sequence_length, input_dim) inputs
            Pooling2D - Pooling layer for sub-sampling (input_depth, input_row, input_col) inputs
            Pooling3D - Pooling layer for sub-sampling (input_depth, input_x, input_y, input_z) inputs
            ZeroPaddingBase - Base layer for zero-padding layers
            ZeroPadding1D - Zero-padding layer for (sequence_length, input_dim) inputs
            ZeroPadding2D - Zero-padding layer for (input_depth, input_row, input_col) inputs
            ZeroPadding3D - Zero-padding layer for (input_depth, input_x, input_y, input_z) inputs
            UnSamplingBase - Base layer for unsampling layers
            UnSampling1D - Repeat each temporal step length times along the time axis
            UnSampling2D - Unsampling layer for (input_depth, input_row, input_col) inputs
            UnSampling3D - Unsampling layer for (input_depth, input_x, input_y, input_z) inputs
        core
            ExpandDims - Expand the dimensions of the input tensor
            PermuteDims - Permute the dimensions of the input tensor according to a given pattern
            Reshape - Reshape the input tensor according to a given pattern
            Flatten - Flatten the input tensor into 1D
            Repeat - Repeat the input tensor n times along a given axis
            Concatenate - Concatenate multiple input tensors
            Lambda - Wrapper for implementing a simple inline layer
            Activation - Apply an activation function to an output
            ElementWiseSum - Reduce multiple input tensors by element-wise summation
            ElementWiseMult - Reduce multiple input tensors by element-wise multiplication
            Dropout - Apply Dropout to the input
            Dense - Fully connected layer
            Highway - Densely connected highway network
        embeddings
            Embedding - Layer for looking up vocabulary (row) vectors
        recurrent
            Recurrent - Base class for recurrent layers
            SimpleRNN - Fully-connected RNN where the output is fed back to the input
            GRU - Gated Recurrent Unit - Cho et al., 2014
            LSTM - Long Short-Term Memory unit - Hochreiter & Schmidhuber, 1997
        wrappers
            TimeDistributed - Wrapper for applying a layer to every temporal slice of an input
    models
        Model - A model is a directed Kensor graph
        Sequential - Model with a single input and a single output
    objectives - Built-in objective functions
    optimizers
        Optimizer - Abstract optimizer base class
        SGD - Stochastic gradient descent, with support for momentum, learning rate decay, and Nesterov momentum
        RMSprop - RMSprop optimizer
        Adagrad - Adagrad optimizer
        Adadelta - Adadelta optimizer
        Adam - Adam optimizer
        Adamax - Adamax optimizer from Section 7 of the Adam paper
    regularizers - Built-in regularizers
        L1 - L1 weight regularization penalty, also known as LASSO
        L2 - L2 weight regularization penalty, also known as weight decay or Ridge
        L1L2 - L1-L2 weight regularization penalty, also known as ElasticNet
    utils
        exceptions
            KeraFlowError
        generic_utils - Generic utils for keraflow
        user_input_utils
            UserInput - Utility class for flexibly assigning user input (optimizers, regularizers, numpy inputs, ...)
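
For orientation, the sketch below shows how a few of the classes in this index (Input, Dense, Activation, Dropout, Sequential, SGD) might be combined. It is a minimal, unverified sketch assuming a Keras-like interface: the import paths follow the namespaces above, but the constructor arguments and the compile/fit signatures are assumptions, not taken from this page.

    # Hypothetical usage sketch -- assumes a Keras-like API; the constructor
    # and method signatures below are NOT confirmed by this class index.
    import numpy as np

    from keraflow.models import Sequential          # single-input, single-output model
    from keraflow.layers.base import Input          # entry point of the model
    from keraflow.layers.core import Dense, Activation, Dropout
    from keraflow.optimizers import SGD

    model = Sequential()
    model.add(Input(100))                           # assumed: 100-dimensional input
    model.add(Dense(64))                            # fully connected layer
    model.add(Activation('relu'))                   # apply an activation function
    model.add(Dropout(0.5))                         # apply Dropout to the input
    model.add(Dense(10))
    model.add(Activation('softmax'))

    # SGD supports momentum, learning rate decay, and Nesterov momentum per the
    # index above; the keyword names used here are assumptions.
    model.compile(optimizer=SGD(lr=0.01, momentum=0.9),
                  loss='categorical_crossentropy')

    x = np.random.rand(32, 100).astype('float32')   # dummy data for illustration
    y = np.random.rand(32, 10).astype('float32')
    model.fit(x, y, nb_epoch=2)                     # assumed Keras-1-style fit signature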