enum dnnl::normalization_flags
Overview
Flags for normalization primitives.
#include <dnnl.hpp>

enum normalization_flags
{
    none               = dnnl_normalization_flags_none,
    use_global_stats   = dnnl_use_global_stats,
    use_scale          = dnnl_use_scale,
    use_shift          = dnnl_use_shift,
    fuse_norm_relu     = dnnl_fuse_norm_relu,
    fuse_norm_add_relu = dnnl_fuse_norm_add_relu,
    rms_norm           = dnnl_rms_norm,
};
Detailed Documentation
Flags for normalization primitives.
Enum Values
none
Use no normalization flags.
If specified, the library computes mean and variance on forward propagation for training and inference, outputs them on forward propagation for training, and computes the respective derivatives on backward propagation.
Note
Backward propagation of type dnnl::prop_kind::backward_data has the same behavior as dnnl::prop_kind::backward.
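As an illustration, a minimal sketch of a batch normalization primitive descriptor created with no flags; the engine kind, tensor shape, and epsilon below are example assumptions, not requirements of the API:

#include <dnnl.hpp>
using namespace dnnl;

engine eng(engine::kind::cpu, 0);

// Example shape only: N=32, C=64, H=28, W=28, f32, NCHW layout.
memory::desc data_md({32, 64, 28, 28},
        memory::data_type::f32, memory::format_tag::nchw);

// With normalization_flags::none and forward_training, the primitive
// computes mean and variance itself and returns them as outputs
// (DNNL_ARG_MEAN and DNNL_ARG_VARIANCE at execution time).
auto bnorm_pd = batch_normalization_forward::primitive_desc(eng,
        prop_kind::forward_training, data_md, data_md,
        /*epsilon=*/1e-5f, normalization_flags::none);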
use_global_stats
Use global statistics.
If specified, the library uses mean and variance provided by the user as an input on forward propagation and does not compute their derivatives on backward propagation. Otherwise, the library computes mean and variance on forward propagation for training and inference, outputs them on forward propagation for training, and computes the respective derivatives on backward propagation.
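A sketch of the corresponding inference setup, where the statistics are supplied by the user; the shapes and epsilon are again example assumptions:

#include <dnnl.hpp>
using namespace dnnl;

engine eng(engine::kind::cpu, 0);
stream strm(eng);

memory::desc data_md({32, 64, 28, 28},
        memory::data_type::f32, memory::format_tag::nchw);

// With use_global_stats, mean and variance become user-provided
// inputs rather than values the library computes.
auto bnorm_pd = batch_normalization_forward::primitive_desc(eng,
        prop_kind::forward_inference, data_md, data_md,
        /*epsilon=*/1e-5f, normalization_flags::use_global_stats);

memory src_mem(bnorm_pd.src_desc(), eng), dst_mem(bnorm_pd.dst_desc(), eng);
memory mean_mem(bnorm_pd.mean_desc(), eng), var_mem(bnorm_pd.variance_desc(), eng);

batch_normalization_forward(bnorm_pd).execute(strm,
        {{DNNL_ARG_SRC, src_mem}, {DNNL_ARG_DST, dst_mem},
                {DNNL_ARG_MEAN, mean_mem}, {DNNL_ARG_VARIANCE, var_mem}});
strm.wait();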
use_scale
Use scale parameter.
If specified, the user is expected to pass scale as input on forward propagation. On backward propagation of type dnnl::prop_kind::backward, the library computes its derivative.
use_shift
Use shift parameter.
If specified, the user is expected to pass shift as input on forward propagation. On backward propagation of type dnnl::prop_kind::backward, the library computes its derivative.
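The two flags are typically combined; a sketch under the same example assumptions (for batch normalization, scale and shift are 1D tensors of size C):

#include <dnnl.hpp>
using namespace dnnl;

engine eng(engine::kind::cpu, 0);

memory::desc data_md({32, 64, 28, 28},
        memory::data_type::f32, memory::format_tag::nchw);

// Flags combine with operator|; scale and shift are then passed as
// inputs under DNNL_ARG_SCALE and DNNL_ARG_SHIFT at execution time.
auto bnorm_pd = batch_normalization_forward::primitive_desc(eng,
        prop_kind::forward_training, data_md, data_md,
        /*epsilon=*/1e-5f,
        normalization_flags::use_scale | normalization_flags::use_shift);

// On backward propagation of type prop_kind::backward, the library
// also produces DNNL_ARG_DIFF_SCALE and DNNL_ARG_DIFF_SHIFT outputs.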
fuse_norm_relu
Fuse normalization with ReLU.
During training, normalization requires a workspace to implement backward propagation. During inference, the workspace is not required, and the behavior is the same as when normalization is fused with ReLU using the post-ops API.
Note
This flag implies a ReLU negative slope of 0. During training, this is the only supported configuration. For inference with a non-zero negative slope, consider using Primitive Attributes: Post-ops instead.
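A sketch of the training configuration, showing how the required workspace is queried from the primitive descriptor (example shapes as before):

#include <dnnl.hpp>
using namespace dnnl;

engine eng(engine::kind::cpu, 0);

memory::desc data_md({32, 64, 28, 28},
        memory::data_type::f32, memory::format_tag::nchw);

auto bnorm_pd = batch_normalization_forward::primitive_desc(eng,
        prop_kind::forward_training, data_md, data_md,
        /*epsilon=*/1e-5f, normalization_flags::fuse_norm_relu);

// Query and allocate the workspace; the same memory object must be
// passed again on backward propagation.
memory ws_mem(bnorm_pd.workspace_desc(), eng);
// Forward execution args then include {DNNL_ARG_WORKSPACE, ws_mem}.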
fuse_norm_add_relu
Fuse normalization with an elementwise binary Add operation followed by ReLU.
During training, normalization requires a workspace to implement backward propagation; for inference, the workspace is not needed. On forward propagation, an elementwise binary Add is applied to the normalization result together with an additional input tensor, followed by ReLU with a negative slope of 0. On backward propagation, the result of the backward ReLU operation (computed from the input tensor and the workspace saved on the forward pass) is written to an extra output tensor, and backward normalization is then performed.
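A sketch of the forward setup; passing the extra tensor as a second source under DNNL_ARG_SRC_1 follows the batch normalization primitive's argument convention, and the shapes are example assumptions:

#include <dnnl.hpp>
using namespace dnnl;

engine eng(engine::kind::cpu, 0);

memory::desc data_md({32, 64, 28, 28},
        memory::data_type::f32, memory::format_tag::nchw);

auto bnorm_pd = batch_normalization_forward::primitive_desc(eng,
        prop_kind::forward_training, data_md, data_md,
        /*epsilon=*/1e-5f, normalization_flags::fuse_norm_add_relu);

// The tensor to add has the same shape as the destination and is
// passed as an extra input (DNNL_ARG_SRC_1); training additionally
// needs {DNNL_ARG_WORKSPACE, ...} as with fuse_norm_relu.
memory add_src_mem(data_md, eng);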
rms_norm
Use Root Mean Square (RMS) Normalization.
In forward propagation, the mean is considered zero, and RMS norm is used instead of variance for scaling. Only the RMS norm is output during forward propagation for training. In backward propagation, the library calculates the derivative with respect to the RMS norm only, assuming the mean is zero.
Note
When used with dnnl::normalization_flags::use_global_stats, only the RMS norm needs to be provided as an input.
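A sketch using layer normalization, which accepts this flag in recent oneDNN releases; the 2D shape and the combination with use_scale are example assumptions:

#include <dnnl.hpp>
using namespace dnnl;

engine eng(engine::kind::cpu, 0);

// Example shape: 16 rows normalized over 1024 channels each.
memory::desc ln_md({16, 1024},
        memory::data_type::f32, memory::format_tag::ab);

// With rms_norm, the mean is treated as zero and the RMS norm takes
// the place of the variance in the scaling step.
auto lnorm_pd = layer_normalization_forward::primitive_desc(eng,
        prop_kind::forward_training, ln_md, ln_md,
        /*epsilon=*/1e-5f,
        normalization_flags::rms_norm | normalization_flags::use_scale);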