ONE - On-device Neural Engine
Public Member Functions

__init__ (self, learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-7)
Data Fields

beta1
beta2
epsilon
learning_rate
nums_trainable_ops
optimizer.adam.Adam.__init__ (self, learning_rate = 0.001, beta1 = 0.9, beta2 = 0.999, epsilon = 1e-7)
Initialize the Adam optimizer.

Args:
    learning_rate (float): The learning rate for optimization.
    beta1 (float): Exponential decay rate for the first moment estimates.
    beta2 (float): Exponential decay rate for the second moment estimates.
    epsilon (float): Small constant to prevent division by zero.
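A minimal usage sketch, assuming the class is importable under the fully qualified name optimizer.adam.Adam shown above (adjust the import to the actual package layout):

    # Construct the optimizer; the arguments shown are the documented defaults.
    from optimizer.adam import Adam

    opt = Adam(learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-7)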
Reimplemented from optimizer.optimizer.Optimizer.
Definition at line 8 of file adam.py.
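For reference, the sketch below shows one textbook Adam update step that these hyperparameters parameterize. It follows the standard published algorithm and is not necessarily this class's exact implementation; adam_step and the moment buffers m and v are illustrative names.

    import numpy as np

    def adam_step(param, grad, m, v, t,
                  learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-7):
        # Update the biased first and second moment estimates of the gradient.
        m = beta1 * m + (1.0 - beta1) * grad
        v = beta2 * v + (1.0 - beta2) * grad ** 2
        # Correct the initialization bias (t is the 1-based step count).
        m_hat = m / (1.0 - beta1 ** t)
        v_hat = v / (1.0 - beta2 ** t)
        # Scale the step by the learning rate; epsilon prevents division by zero.
        param = param - learning_rate * m_hat / (np.sqrt(v_hat) + epsilon)
        return param, m, v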