aiopti_adam_default.h File Reference

Default implementation of the Adam optimizer.


Data Structures

struct  aiopti_adam_f32
 Data-type specific Adam optimizer struct for F32.
 

Typedefs

typedef struct aiopti_adam_f32 aiopti_adam_f32_t
 

Functions

aiopti_t * aiopti_adam_f32_default (aiopti_adam_f32_t *opti)
 Initializes an Adam optimizer with the F32 default implementation.
 
void aiopti_adam_f32_default_begin_step (aiopti_t *self)
 F32 default implementation of the aiopti.begin_step function for Adam.
 
void aiopti_adam_f32_default_end_step (aiopti_t *self)
 F32 default implementation of the aiopti.end_step function for Adam.
 

Detailed Description

Default implementation of the Adam optimizer.

Version
2.2.0

Hardware-independent implementations of the Adam optimizer in the F32 and Q31 data-types. For more information about the Adam optimizer, refer to aiopti_adam.h.

Function Documentation

◆ aiopti_adam_f32_default()

aiopti_t* aiopti_adam_f32_default (aiopti_adam_f32_t *opti)

Initializes an Adam optimizer with the F32 default implementation.

Example: Create the optimizer structure:
In C:

aiopti_adam_f32_t adam_optimizer = {
    .learning_rate = 0.01f,
    .beta1 = 0.9f,
    .beta2 = 0.999f,
    .eps = 1e-7f
};

In C, C++ and on Arduino:

aiopti_adam_f32_t adam_optimizer = AIOPTI_ADAM_F32(0.01f, 0.9f, 0.999f, 1e-7f);

Example: Initialize the optimizer:

aiopti_t *optimizer;
optimizer = aiopti_adam_f32_default(&adam_optimizer);
Parameters
*opti  The optimizer structure to initialize.
Returns
The (successfully) initialized optimizer structure.
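
Example: Use the optimizer interface in a training step. This is only a schematic sketch: begin_step and end_step are the aiopti_t interface members documented below, while the loop body and the variable num_steps are placeholders, not AIfES API.

for (int step = 0; step < num_steps; step++) {
    optimizer->begin_step(optimizer); /* recompute the bias-corrected learning rate lr_t */
    /* ... compute gradients and apply the parameter update here ... */
    optimizer->end_step(optimizer);   /* advance the running powers beta1^t and beta2^t */
}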

◆ aiopti_adam_f32_default_begin_step()

void aiopti_adam_f32_default_begin_step (aiopti_t *self)

F32 default implementation of the aiopti.begin_step function for Adam.

Implementation of aiopti.begin_step.

The Adam optimizer needs to modify the learning rate in every optimization step. This function deals with aiscalars and therefore has to be implemented individually for every data-type.

The calculations are:

\[ lr_t \leftarrow lr \cdot \frac{\sqrt{1 - \beta^t_2}}{1 - \beta^t_1} \]

This function is not primarily time-critical because it only operates on scalars, so no special hardware implementation is necessary (though possible).
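
As a minimal plain-C sketch of this scalar update (hypothetical function and variable names; in the default implementation the values live in the optimizer's aiscalar fields):

#include <math.h>

/* Bias-corrected learning rate for the current step:
   lr_t = lr * sqrt(1 - beta2^t) / (1 - beta1^t) */
float adam_corrected_lr(float lr, float beta1_t, float beta2_t)
{
    return lr * sqrtf(1.0f - beta2_t) / (1.0f - beta1_t);
}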

Parameters
*self  The optimizer structure

◆ aiopti_adam_f32_default_end_step()

void aiopti_adam_f32_default_end_step (aiopti_t *self)

F32 default implementation of the aiopti.end_step function for Adam.

Implementation of aiopti.end_step.

The Adam optimizer needs to modify the learning rate in every optimization step. This function deals with aiscalars and therefore has to be implemented individually for every data-type.

The calculations are:

\[ \beta^t_1 \leftarrow \beta^t_1 \cdot \beta_1 \]

\[ \beta^t_2 \leftarrow \beta^t_2 \cdot \beta_2 \]

This function is not primarily time-critical because it only operates on scalars, so no special hardware implementation is necessary (though possible).
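
A minimal plain-C sketch of these two scalar updates (hypothetical names; the default implementation operates on the optimizer's aiscalar fields):

/* Advance the running powers of the beta coefficients after a step. */
void adam_advance_betas(float *beta1_t, float *beta2_t, float beta1, float beta2)
{
    *beta1_t *= beta1; /* beta1^t -> beta1^(t+1) */
    *beta2_t *= beta2; /* beta2^t -> beta2^(t+1) */
}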

Parameters
*self  The optimizer structure