GaussianNoise
keras.layers.GaussianNoise(stddev)
Apply additive zero-centered Gaussian noise.
This is useful to mitigate overfitting (you could see it as a form of random data augmentation). Gaussian Noise (GS) is a natural choice as a corruption process for real-valued inputs.
As it is a regularization layer, it is only active at training time.
Arguments
- stddev: float, standard deviation of the noise distribution.
Input shape
Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape
Same shape as input.
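A minimal usage sketch follows; the input dimension, layer sizes, optimizer, and training data are illustrative choices, not part of this page. Only GaussianNoise(stddev) itself comes from the documentation above.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, GaussianNoise

# Corrupt real-valued inputs with zero-centered Gaussian noise;
# the noise is applied only during training.
model = Sequential([
    GaussianNoise(0.1, input_shape=(20,)),  # stddev = 0.1
    Dense(64, activation='relu'),
    Dense(1),
])
model.compile(optimizer='adam', loss='mse')

x = np.random.rand(128, 20)
y = np.random.rand(128, 1)
model.fit(x, y, epochs=2, batch_size=32, verbose=0)  # noise active here
model.predict(x)                                     # no noise at inference time
```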
GaussianDropout
keras.layers.GaussianDropout(rate)
Apply multiplicative 1-centered Gaussian noise.
As it is a regularization layer, it is only active at training time.
Arguments
- rate: float, drop probability (as with Dropout). The multiplicative noise will have standard deviation sqrt(rate / (1 - rate)).
Input shape
Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape
Same shape as input.
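A minimal sketch of GaussianDropout as a hidden-layer regularizer; the surrounding architecture is hypothetical, and only GaussianDropout(rate) and the noise formula come from this page.

```python
import math
from keras.models import Sequential
from keras.layers import Dense, GaussianDropout

rate = 0.3
print(math.sqrt(rate / (1 - rate)))  # ~0.655, stddev of the 1-centered noise

model = Sequential([
    Dense(64, activation='relu', input_shape=(20,)),
    GaussianDropout(rate),  # multiplicative Gaussian noise, training time only
    Dense(1),
])
model.compile(optimizer='adam', loss='mse')
```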
AlphaDropout
keras.layers.AlphaDropout(rate, noise_shape=None, seed=None)
Applies Alpha Dropout to the input.
Alpha Dropout is a Dropout that keeps the mean and variance of its inputs at their original values, in order to ensure the self-normalizing property even after this dropout. Alpha Dropout fits well with Scaled Exponential Linear Units by randomly setting activations to the negative saturation value.
Arguments
- rate: float, drop probability (as with Dropout). The multiplicative noise will have standard deviation sqrt(rate / (1 - rate)).
- noise_shape: A 1-D Tensor of type int32, representing the shape for randomly generated keep/drop flags.
- seed: A Python integer to use as random seed.
Input shape
Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape
Same shape as input.
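A minimal sketch pairing AlphaDropout with SELU activations; the 'lecun_normal' initializer and layer sizes are conventional choices for self-normalizing networks and are assumptions here, not stated on this page.

```python
from keras.models import Sequential
from keras.layers import Dense, AlphaDropout

model = Sequential([
    Dense(64, activation='selu', kernel_initializer='lecun_normal',
          input_shape=(20,)),
    AlphaDropout(0.1),  # preserves mean and variance of the SELU activations
    Dense(64, activation='selu', kernel_initializer='lecun_normal'),
    AlphaDropout(0.1),
    Dense(1),
])
model.compile(optimizer='adam', loss='mse')
```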