ReLU

public class ReLU

Rectified Linear Unit (ReLU) activation.

With default values, this returns the standard ReLU activation: max(x, 0), the element-wise maximum of 0 and the input tensor.

Modifying the default parameters allows you to use a non-zero threshold, change the max value of the activation, and use a non-zero multiple of the input for values below the threshold.

For example:

     Operand<TFloat32> input = tf.constant(
              new float[] {-10f, -5f, 0.0f, 5f, 10f});

     // With default parameters
     ReLU<TFloat32> relu = new ReLU<>(tf);
     Operand<TFloat32> result = relu.call(input);
     // result is [0.f,  0.f,  0.f,  5.f, 10.f]

     // With alpha = 0.5
     relu = new ReLU<>(tf, 0.5f, ReLU.MAX_VALUE_DEFAULT, ReLU.THRESHOLD_DEFAULT);
     result = relu.call(input);
     // result is [-5.f , -2.5f,  0.f ,  5.f , 10.f]

     // With maxValue = 5
     relu = new ReLU<>(tf, ReLU.ALPHA_DEFAULT, 5f, ReLU.THRESHOLD_DEFAULT);
     result = relu.call(input);
     // result is [0.f, 0.f, 0.f, 5.f, 5.f]

     // With threshold = 5
     relu = new ReLU<>(tf, ReLU.ALPHA_DEFAULT, ReLU.MAX_VALUE_DEFAULT, 5f);
     result = relu.call(input);
     // result is [-0.f, -0.f,  0.f,  0.f, 10.f]
 

Constants

float ALPHA_DEFAULT
float MAX_VALUE_DEFAULT
float THRESHOLD_DEFAULT

Public Constructors

ReLU (Ops tf)
Creates a new ReLU with alpha=ALPHA_DEFAULT, maxValue=MAX_VALUE_DEFAULT, threshold=THRESHOLD_DEFAULT.
ReLU (Ops tf, float alpha, float maxValue, float threshold)
Creates a new ReLU.

Public Methods

Operand<T>
call (Operand<T> input)
Gets the calculation operation for the activation.

Constants

public static final float ALPHA_DEFAULT

Constant Value: 0.0

public static final float MAX_VALUE_DEFAULT

Constant Value: NaN

public static final float THRESHOLD_DEFAULT

Constant Value: 0.0

Public Constructors

public ReLU (Ops tf)

Creates a new ReLU with alpha=ALPHA_DEFAULT, maxValue=MAX_VALUE_DEFAULT, threshold=THRESHOLD_DEFAULT.

Parameters
tf the TensorFlow Ops
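
For instance, a minimal sketch of constructing the activation against a graph-mode Ops (assuming the core TensorFlow Java API, where Graph implements ExecutionEnvironment and Ops.create(graph) builds a graph-mode Ops):

     import org.tensorflow.Graph;
     import org.tensorflow.Operand;
     import org.tensorflow.op.Ops;
     import org.tensorflow.types.TFloat32;
     import org.tensorflow.framework.activations.ReLU;

     try (Graph g = new Graph()) {
       Ops tf = Ops.create(g);
       // Uses the defaults: alpha=0, maxValue=NaN (unbounded), threshold=0
       ReLU<TFloat32> relu = new ReLU<>(tf);
       Operand<TFloat32> output = relu.call(tf.constant(new float[] {-1f, 2f}));
     }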

public ReLU (Ops tf, float alpha, float maxValue, float threshold)

Creates a new ReLU.

Parameters
tf the TensorFlow Ops
alpha governs the slope for values lower than the threshold.
maxValue sets the saturation threshold (the largest value the function will return).
threshold the threshold value of the activation function below which values will be damped or set to zero.
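
Taken together, the three parameters define a piecewise, element-wise rule. The following plain-Java helper is an illustrative sketch of that rule (the method name is hypothetical and assumes Keras-compatible semantics; it is not part of this class):

     // Hypothetical helper mirroring the element-wise rule implied by the parameters:
     //   x >  threshold : x, capped at maxValue when maxValue is set (non-NaN)
     //   x <= threshold : alpha * (x - threshold)
     // (The sign of zero may differ from the actual TensorFlow kernel.)
     static float reluValue(float x, float alpha, float maxValue, float threshold) {
       if (x > threshold) {
         return Float.isNaN(maxValue) ? x : Math.min(x, maxValue);
       }
       return alpha * (x - threshold);
     }

With the defaults (alpha=0, maxValue=NaN, threshold=0) this reduces to max(x, 0), matching the example at the top of this page.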

Public Methods

public Operand<T> call (Operand<T> input)

Gets the calculation operation for the activation.

Parameters
input the input tensor
Returns
The operand for the activation
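
A minimal end-to-end sketch in eager mode (assuming, as in the core TensorFlow Java API, that Ops.create() yields an eager-mode Ops and that the computed values can be read back through asTensor() and getFloat()):

     Ops tf = Ops.create();                        // eager execution environment
     ReLU<TFloat32> relu = new ReLU<>(tf, 0.5f, ReLU.MAX_VALUE_DEFAULT, ReLU.THRESHOLD_DEFAULT);
     Operand<TFloat32> result = relu.call(tf.constant(new float[] {-10f, -5f, 0f, 5f, 10f}));

     // In eager mode the result is already computed; read it back element by element
     TFloat32 values = result.asTensor();
     for (int i = 0; i < 5; i++) {
       System.out.println(values.getFloat(i));     // -5.0, -2.5, 0.0, 5.0, 10.0
     }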