ONE - On-device Neural Engine
package.experimental.train.optimizer.adam.Adam Class Reference

Public Member Functions

None __init__ (self, float learning_rate=0.001, float beta1=0.9, float beta2=0.999, float epsilon=1e-7)
 

Detailed Description

Adam optimizer.

Definition at line 5 of file adam.py.

Constructor & Destructor Documentation

◆ __init__()

None package.experimental.train.optimizer.adam.Adam.__init__ (   self,
float   learning_rate = 0.001,
float   beta1 = 0.9,
float   beta2 = 0.999,
float   epsilon = 1e-7 
)
Initialize the Adam optimizer.

Args:
    learning_rate (float): The learning rate for optimization.
    beta1 (float): Exponential decay rate for the first moment estimates.
    beta2 (float): Exponential decay rate for the second moment estimates.
    epsilon (float): Small constant to prevent division by zero.

Reimplemented from package.experimental.train.optimizer.optimizer.Optimizer.

Definition at line 9 of file adam.py.

 9    def __init__(self,
10                 learning_rate: float = 0.001,
11                 beta1: float = 0.9,
12                 beta2: float = 0.999,
13                 epsilon: float = 1e-7) -> None:
14        """
15        Initialize the Adam optimizer.
16
17        Args:
18            learning_rate (float): The learning rate for optimization.
19            beta1 (float): Exponential decay rate for the first moment estimates.
20            beta2 (float): Exponential decay rate for the second moment estimates.
21            epsilon (float): Small constant to prevent division by zero.
22        """
23        super().__init__(learning_rate)
24        self.beta1: float = beta1
25        self.beta2: float = beta2
26        self.epsilon: float = epsilon
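Since the listing only shows `super().__init__(learning_rate)`, the base class is assumed to store the learning rate; the stub `Optimizer` below is a stand-in for `package.experimental.train.optimizer.optimizer.Optimizer`, not the package's actual code. A usage sketch with the documented defaults:

```python
# Minimal stand-in for the Optimizer base class (assumption: it only
# records the learning rate, inferred from super().__init__ above).
class Optimizer:
    def __init__(self, learning_rate: float) -> None:
        self.learning_rate = learning_rate

class Adam(Optimizer):
    """Adam optimizer, as documented in this class reference."""

    def __init__(self, learning_rate: float = 0.001, beta1: float = 0.9,
                 beta2: float = 0.999, epsilon: float = 1e-7) -> None:
        super().__init__(learning_rate)
        self.beta1: float = beta1
        self.beta2: float = beta2
        self.epsilon: float = epsilon

# Constructing with no arguments picks up the documented defaults.
opt = Adam()
```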



The documentation for this class was generated from the following file: adam.py