tf.keras.layers.LeakyReLU

Class LeakyReLU

Leaky version of a Rectified Linear Unit.

Inherits From: Layer

Aliases: tf.compat.v1.keras.layers.LeakyReLU, tf.compat.v2.keras.layers.LeakyReLU

It allows a small gradient when the unit is not active:

f(x) = alpha * x for x < 0
f(x) = x for x >= 0
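
For example, a minimal sketch of the activation (the alpha value and sample inputs here are illustrative):

import tensorflow as tf

# Negative inputs are scaled by alpha; non-negative inputs pass through.
layer = tf.keras.layers.LeakyReLU(alpha=0.1)
output = layer(tf.constant([-3.0, -1.0, 0.0, 2.0]))
print(output.numpy())  # [-0.3 -0.1  0.   2. ]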

Input shape:

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape:

Same shape as the input.
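
For example, a brief sketch of using input_shape when LeakyReLU is the first layer in a model (the layer sizes are illustrative assumptions):

from tensorflow import keras

# input_shape declares the per-sample shape; the samples axis is omitted.
model = keras.Sequential([
    keras.layers.LeakyReLU(alpha=0.2, input_shape=(10,)),
    keras.layers.Dense(1),
])
model.summary()  # LeakyReLU's output shape, (None, 10), matches its input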

Arguments:

  • alpha: Float >= 0. Negative slope coefficient. Defaults to 0.3.

__init__

__init__(
    alpha=0.3,
    **kwargs
)
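
For example, constructing the layer with its default alpha of 0.3 (the sample input is illustrative):

import tensorflow as tf

# With no arguments, alpha defaults to 0.3.
layer = tf.keras.layers.LeakyReLU()
print(layer(tf.constant([-1.0, 2.0])).numpy())  # [-0.3  2. ]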