
Selu

public final class Selu

Computes the scaled exponential linear unit: `scale * alpha * (exp(features) - 1)` if `features < 0`, `scale * features` otherwise.

To be used together with `initializer = tf.variance_scaling_initializer(factor=1.0, mode='FAN_IN')`. For correct dropout, use `tf.contrib.nn.alpha_dropout`.

See [Self-Normalizing Neural Networks](https://arxiv.org/abs/1706.02515)
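For illustration, the element-wise transform above can be sketched in plain Java. The constants `ALPHA` and `SCALE` are the values derived in the Self-Normalizing Neural Networks paper; this is a scalar sketch of the formula, not the TensorFlow kernel:

```java
public class SeluSketch {
    // Constants from Klambauer et al., "Self-Normalizing Neural Networks" (2017).
    static final double ALPHA = 1.6732632423543772;
    static final double SCALE = 1.0507009873554805;

    // Scalar SELU: scale * alpha * (exp(x) - 1) if x < 0, scale * x otherwise.
    static double selu(double x) {
        return x < 0 ? SCALE * ALPHA * (Math.exp(x) - 1.0) : SCALE * x;
    }

    public static void main(String[] args) {
        System.out.println(selu(1.0));   // positive branch: SCALE * 1.0
        System.out.println(selu(-1.0));  // negative branch, approaches -SCALE * ALPHA
    }
}
```

The negative branch saturates toward `-scale * alpha`, which is what gives the activation its self-normalizing fixed point.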

Constants

String	OP_NAME	The name of this op, as known by TensorFlow core engine

Public Methods

Output<T>
activations ()
Output<T>
asOutput ()
Returns the symbolic handle of the tensor.
static <T extends TNumber> Selu<T>
create ( Scope scope, Operand<T> features)
Factory method to create a class wrapping a new Selu operation.

Inherited Methods

Constants

public static final String OP_NAME

The name of this op, as known by TensorFlow core engine

Constant Value: "Selu"

Public Methods

public Output<T> activations ()

public Output<T> asOutput ()

Returns the symbolic handle of the tensor.

Inputs to TensorFlow operations are outputs of another TensorFlow operation. This method is used to obtain a symbolic handle that represents the computation of the input.

public static <T extends TNumber> Selu<T> create ( Scope scope, Operand<T> features)

Factory method to create a class wrapping a new Selu operation.

Parameters
scope	current scope
features	the features value
Returns
  • a new instance of Selu