AIfES 2
2.0.0
Base layer implementation of the ReLU activation layer.
Data Structures
struct ailayer_relu
General ReLU layer struct.
Typedefs
typedef struct ailayer_relu ailayer_relu_t
Functions
ailayer_t * ailayer_relu (ailayer_relu_t *layer, ailayer_t *input_layer)
Initialize and connect the given ReLU layer.
void ailayer_relu_forward (ailayer_t *self)
Calculate the forward pass for the given ReLU layer.
void ailayer_relu_backward (ailayer_t *self)
Calculate the backward pass for the given ReLU layer.
void ailayer_relu_calc_result_shape (ailayer_t *self)
Calculate the shape of the result tensor.
void ailayer_relu_print_specs (const ailayer_t *self)
Print the layer specification.
Variables
const aicore_layertype_t * ailayer_relu_type
ReLU layer type.
Base layer implementation of the ReLU activation layer.
This is an "abstract", data-type-independent implementation. To use the layer, use one of the provided implementations for a specific hardware and data type (for example from ailayer_relu_default.h), or set the required math functions on your own, as sketched below.
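For illustration, a minimal sketch of such a custom math function follows. The element-wise signature (input tensor in, result tensor out), the aitensor_t field names (dim, shape, data) and the name my_relu_f32 are assumptions for this sketch, not verified against the headers.

#include "aifes.h"

// Hypothetical custom float32 ReLU kernel (element-wise).
void my_relu_f32(const aitensor_t *x, aitensor_t *result)
{
    // Total element count: product over all dimensions (field names assumed).
    uint32_t n = 1;
    for (uint8_t d = 0; d < x->dim; d++) {
        n *= x->shape[d];
    }
    const float *in = (const float *)x->data;
    float *out = (float *)result->data;
    for (uint32_t i = 0; i < n; i++) {
        out[i] = (in[i] >= 0.0f) ? in[i] : 0.0f;  // y = max(0, x)
    }
}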
The ReLU layer is used as an activation function layer right after a dense layer. It calculates
\[ y = \begin{cases} 0 & \text{if } x < 0\\ x & \text{if } x \geq 0 \end{cases} \]
for every element of the input tensor.
The results of the forward pass of this layer are written to the result tensor of the base ailayer_t struct.
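A hedged usage sketch: wiring a ReLU activation directly after a dense layer with the f32 default implementations. ailayer_relu_f32_default() is referenced below; the struct type names (ailayer_input_f32_t, ailayer_dense_f32_t, ailayer_relu_f32_t) and the input-layer field names follow the library's naming scheme but should be checked against the concrete headers.

#include "aifes.h"

uint16_t input_shape[] = {1, 2};
ailayer_input_f32_t input_layer;
ailayer_dense_f32_t dense_layer;
ailayer_relu_f32_t  relu_layer;

void build_model(void)
{
    input_layer.input_dim   = 2;            // field names assumed
    input_layer.input_shape = input_shape;
    dense_layer.neurons     = 4;

    ailayer_t *x;
    x = ailayer_input_f32_default(&input_layer);
    x = ailayer_dense_f32_default(&dense_layer, x);
    x = ailayer_relu_f32_default(&relu_layer, x);  // ReLU right after the dense layer
}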
ailayer_t * ailayer_relu (ailayer_relu_t *layer, ailayer_t *input_layer)
Initialize and connect the given ReLU layer.
This function represents the "constructor" of the abstract ReLU layer. It initializes the layer structure and connects it to the previous layer.
This function is not intended to be called directly. Instead, use one of the data-type-specific implementations (for example ailayer_relu_f32_default()).
Parameters
*layer  The layer to initialize.
*input_layer  The previous layer that provides the inputs to the layer.
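To illustrate the intended call path, a hypothetical data-type-specific wrapper in the style of ailayer_relu_f32_default() might look like the sketch below. The math-function member names (relu, d_relu) and the kernel functions are assumptions for illustration.

// Hypothetical derivative kernel, same assumed signature as my_relu_f32 above.
void my_d_relu_f32(const aitensor_t *x, aitensor_t *result);

ailayer_t *my_ailayer_relu_f32(ailayer_relu_t *layer, ailayer_t *input_layer)
{
    // 1. Plug in the data-type specific math functions (member names assumed).
    layer->relu   = my_relu_f32;
    layer->d_relu = my_d_relu_f32;
    // 2. Delegate the generic wiring to the abstract constructor.
    return ailayer_relu(layer, input_layer);
}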
void ailayer_relu_backward (ailayer_t *self)
Calculate the backward pass for the given ReLU layer.
Implementation of ailayer.backward.
It uses the deltas tensor of the next layer as input and writes the result of the backward pass to the deltas tensor (ailayer.deltas) of the given layer.
Calculation of the errors for the previous layer:
\[ \delta_{in} \leftarrow \delta_{out} \circ ReLU'(x_{in}) \]
\( x_{in} \): Result of the forward pass of the previous layer
\( \delta_{in} \): Result of the backward pass of this layer
\( \delta_{out} \): Result of the backward pass of the next layer
Used math functions:
Parameters
*self  Layer to calculate the backward pass for.
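As a concrete illustration of the formula above, here is a plain-C sketch of the element-wise backward step, operating on raw float buffers rather than the library's tensors and math functions:

#include <stdint.h>

// delta_in[i] = delta_out[i] * ReLU'(x_in[i]). The subgradient at x = 0 is
// a convention; this sketch follows the forward definition (ReLU'(0) = 1).
void relu_backward_sketch(const float *x_in, const float *delta_out,
                          float *delta_in, uint32_t n)
{
    for (uint32_t i = 0; i < n; i++) {
        delta_in[i] = (x_in[i] >= 0.0f) ? delta_out[i] : 0.0f;
    }
}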
void ailayer_relu_calc_result_shape (ailayer_t *self)
Calculate the shape of the result tensor.
Implementation of ailayer.calc_result_shape.
As the result tensor shape is shared with the result tensor shape of the previous layer (no change in shape is needed), this function returns without doing anything.
Parameters
*self  Layer to calculate the resulting shape for.
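A no-op body is consistent with the description above; roughly (a sketch, not the verified source):

void relu_calc_result_shape_sketch(ailayer_t *self)
{
    (void)self;  // element-wise layer: result shape == input shape, already shared
}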
void ailayer_relu_forward (ailayer_t *self)
Calculate the forward pass for the given ReLU layer.
Implementation of ailayer.forward.
It uses the result tensor of the previous layer as input and writes the result of the forward pass to the result tensor (ailayer.result) of the given layer.
Calculation of the forward pass result:
\[ x_{out} \leftarrow ReLU(x_{in}) \]
\( x_{in} \): Result of the forward pass of the previous layer
\( x_{out} \): Result of the forward pass of this layer
Used math functions:
Parameters
*self  Layer to calculate the forward pass for.
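The tensor routing described above might look roughly like the dispatch sketch below. The ailayer_t members used here (layer_configuration, input_layer, result) are assumptions based on the base-struct description, not verified field names.

// Read from the previous layer's result tensor and write to this layer's
// result tensor via the configured math function.
void relu_forward_sketch(ailayer_t *self)
{
    ailayer_relu_t *layer = (ailayer_relu_t *)self->layer_configuration;
    layer->relu(&self->input_layer->result, &self->result);
}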
void ailayer_relu_print_specs (const ailayer_t *self)
Print the layer specification.
Parameters
*self  The layer to print the specification for.
extern const aicore_layertype_t * ailayer_relu_type
ReLU layer type.
Defines the type of the layer (for example for type checks and debug prints). See aicore_layertype for more information about the layer type.