Keraflow
Deep Learning for Python.
keraflow.optimizers.Adamax Class Reference

Adamax optimizer, from Section 7 of the Adam paper. More...

Inheritance diagram for keraflow.optimizers.Adamax:
keraflow.optimizers.Optimizer

Public Member Functions

def __init__
 Default parameters follow those provided in the paper. More...
 
- Public Member Functions inherited from keraflow.optimizers.Optimizer
def __init__
 

Detailed Description

Adamax optimizer, from Section 7 of the Adam paper.

It is a variant of Adam based on the infinity norm.
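The update rule can be sketched in plain NumPy. This is an illustrative re-implementation of the rule described in Section 7 of the Adam paper, not Keraflow's own code; the function name `adamax_step` and its state-passing style are assumptions made for the example:

```python
import numpy as np

def adamax_step(theta, grad, m, u, t,
                lr=0.002, beta_1=0.9, beta_2=0.999, epsilon=1e-8):
    """One Adamax update (Adam paper, Section 7) with the same default
    hyper-parameters as the constructor documented below. A plain-NumPy
    sketch, not Keraflow's implementation."""
    m = beta_1 * m + (1 - beta_1) * grad       # biased first-moment estimate
    u = np.maximum(beta_2 * u, np.abs(grad))   # exponentially weighted infinity norm
    step = lr / (1 - beta_1 ** t)              # bias-correct the step size
    theta = theta - step * m / (u + epsilon)   # epsilon guards against division by zero
    return theta, m, u

# Minimize f(x) = x^2 from x = 1.0 as a sanity check.
theta = np.array([1.0])
m, u = np.zeros(1), np.zeros(1)
for t in range(1, 101):
    grad = 2 * theta
    theta, m, u = adamax_step(theta, grad, m, u, t)
```

Because the second moment is replaced by a running infinity norm, the per-parameter step is bounded by the (bias-corrected) learning rate, which is why Adamax tolerates sparse or occasionally large gradients.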

Constructor & Destructor Documentation

def keraflow.optimizers.Adamax.__init__(self,
    lr = 0.002,
    beta_1 = 0.9,
    beta_2 = 0.999,
    epsilon = 1e-8,
    **kwargs
)

Default parameters follow those provided in the Adam paper.

Parameters
    lr                float >= 0. Learning rate.
    beta_1, beta_2    floats, 0 < beta < 1. Generally close to 1.
    epsilon           float >= 0. Fuzz factor.
    kwargs            see Optimizer.__init__
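To see why the defaults interact the way they do, it helps to look at the bias-corrected step size lr / (1 - beta_1**t): it starts larger than lr and decays toward it as t grows. The snippet below is a small standalone illustration (not Keraflow code) using the default lr and beta_1 values documented above:

```python
lr, beta_1 = 0.002, 0.9  # defaults from the constructor above

def effective_step(t):
    # Bias correction inflates the step early on (t small),
    # then the multiplier decays toward plain lr.
    return lr / (1 - beta_1 ** t)

for t in (1, 2, 10, 100):
    print(t, effective_step(t))
```

At t = 1 the effective step is lr / (1 - beta_1) = 0.02, ten times lr; by t = 100 it is essentially lr itself.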