# LeakyReLU

## General

The LeakyReLU operation is a type of activation function based on ReLU. It has a small slope for negative values, so it produces small, non-zero, constant gradients for negative inputs. This slope is also called the coefficient of leakage.

Unlike PReLU, the coefficient \(\alpha\) is constant and defined before training.

The LeakyReLU operation applies the following formula to every element of the \(\src\) tensor (the variable names follow the standard Naming Conventions):

\[\dst = \begin{cases} \src & \text{if}\ \src \ge 0 \\ \alpha \cdot \src & \text{if}\ \src < 0 \end{cases}\]
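
For illustration, a minimal scalar reference implementation of this formula (not the library's optimized kernel; the function name is hypothetical) could look like this:

```cpp
#include <cstddef>

// Reference semantics of LeakyReLU: dst = src when src >= 0,
// and alpha * src otherwise. alpha is the coefficient of leakage.
void leaky_relu_ref(const float *src, float *dst, std::size_t n, float alpha) {
    for (std::size_t i = 0; i < n; ++i)
        dst[i] = src[i] >= 0.0f ? src[i] : alpha * src[i];
}
```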

## Operation attributes

| Attribute Name | Description | Value Type | Supported Values | Required or Optional |
|---|---|---|---|---|
| alpha | Alpha is the coefficient of leakage. | f32 | Arbitrary f32 value, but usually a small positive value. | Required |
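
As a sketch of how the required alpha attribute might be set through the oneDNN Graph C++ API (the helper name, op id, and the 0.01 slope are arbitrary placeholders):

```cpp
#include "oneapi/dnnl/dnnl_graph.hpp"

using namespace dnnl::graph;

// Create a LeakyReLU op and attach the required alpha attribute.
op make_leaky_relu_op() {
    op leaky_relu(0, op::kind::LeakyReLU, "leaky_relu");
    leaky_relu.set_attr<float>(op::attr::alpha, 0.01f); // coefficient of leakage
    return leaky_relu;
}
```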

## Execution arguments

The inputs and outputs must be provided according to the index order shown below when constructing an operation.

### Inputs

| Index | Argument Name | Required or Optional |
|---|---|---|
| 0 | src | Required |

### Outputs

| Index | Argument Name | Required or Optional |
|---|---|---|
| 0 | dst | Required |
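
Putting the pieces together, the single src input (index 0) and dst output (index 0) might be attached in index order as sketched below, again assuming the oneDNN Graph C++ API; the function name, tensor ids, and shape are placeholders:

```cpp
#include "oneapi/dnnl/dnnl_graph.hpp"

using namespace dnnl::graph;

void build_leaky_relu_graph() {
    // Input index 0 (src) and output index 0 (dst) with matching shapes.
    logical_tensor src {0, logical_tensor::data_type::f32, {8, 56, 56, 256},
            logical_tensor::layout_type::strided};
    logical_tensor dst {1, logical_tensor::data_type::f32, {8, 56, 56, 256},
            logical_tensor::layout_type::strided};

    // Inputs and outputs are passed in the index order from the tables above.
    op leaky_relu(2, op::kind::LeakyReLU, {src}, {dst}, "leaky_relu");
    leaky_relu.set_attr<float>(op::attr::alpha, 0.01f);

    graph g(dnnl::engine::kind::cpu);
    g.add_op(leaky_relu);
    g.finalize();
}
```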

## Supported data types

The LeakyReLU operation supports the following data type combinations.

| Src | Dst |
|---|---|
| f32 | f32 |
| bf16 | bf16 |
| f16 | f16 |
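
For example, choosing the bf16 row of the table simply means declaring both logical tensors with the same bf16 data type (a sketch with placeholder ids and shape):

```cpp
#include "oneapi/dnnl/dnnl_graph.hpp"

using namespace dnnl::graph;

// src and dst must use one of the combinations above, e.g. bf16 -> bf16.
logical_tensor src_bf16 {0, logical_tensor::data_type::bf16, {1, 128},
        logical_tensor::layout_type::strided};
logical_tensor dst_bf16 {1, logical_tensor::data_type::bf16, {1, 128},
        logical_tensor::layout_type::strided};
```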