Keraflow
Deep Learning for Python.
Adamax optimizer from Section 7 of the Adam paper.
Public Member Functions

def __init__
    Default parameters follow those provided in the paper.
Adamax, described in Section 7 of the Adam paper, is a variant of Adam based on the infinity norm.
def keraflow.optimizers.Adamax.__init__(self, lr=0.002, beta_1=0.9, beta_2=0.999, epsilon=1e-8, **kwargs)
Default parameters follow those provided in the paper.

Parameters
    lr: float >= 0. Learning rate.
    beta_1, beta_2: floats, 0 < beta < 1. Generally close to 1.
    epsilon: float >= 0. Fuzz factor.
    kwargs: see Optimizer.__init__.
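To make the infinity-norm idea concrete, here is a minimal sketch of a single scalar Adamax update step. This is illustrative only, not Keraflow's internal implementation; the default argument values mirror the constructor parameters documented above, and the `adamax_step` helper name is hypothetical.

```python
def adamax_step(param, grad, m, u, t, lr=0.002,
                beta_1=0.9, beta_2=0.999, epsilon=1e-8):
    """One scalar Adamax step; returns (new_param, new_m, new_u)."""
    m = beta_1 * m + (1 - beta_1) * grad   # biased first-moment estimate
    u = max(beta_2 * u, abs(grad))         # exponentially weighted infinity norm
    # Only the first moment needs bias correction; u does not.
    param = param - (lr / (1 - beta_1 ** t)) * m / (u + epsilon)
    return param, m, u

# Usage sketch: minimize f(x) = x**2 starting from x = 1.0
x, m, u = 1.0, 0.0, 0.0
for t in range(1, 501):
    x, m, u = adamax_step(x, 2 * x, m, u, t)   # gradient of x**2 is 2x
```

The key difference from Adam is the `u` term: instead of an exponentially decaying average of squared gradients, Adamax tracks an exponentially weighted infinity norm of past gradients, which removes the need for bias correction on the second-moment term.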