# ReLU

public class ReLU

Rectified Linear Unit (ReLU) activation.

With default values, this returns the standard ReLU activation, `max(x, 0)`, the element-wise maximum of 0 and the input tensor.

Modifying the default parameters allows you to use a non-zero threshold, change the max value of the activation, and use a non-zero multiple of the input for values below the threshold.

For example:

```
Operand<TFloat32> input = tf.constant(
    new float[] {-10f, -5f, 0.0f, 5f, 10f});

// With default parameters
ReLU<TFloat32> relu = new ReLU<>(tf);
Operand<TFloat32> result = relu.call(input);
// result is [0.f,  0.f,  0.f,  5.f, 10.f]

// With alpha = 0.5
relu = new ReLU<>(tf, 0.5f, ReLU.MAX_VALUE_DEFAULT, ReLU.THRESHOLD_DEFAULT);
result = relu.call(input);
// result is [-5.f , -2.5f,  0.f ,  5.f , 10.f]

// With maxValue = 5
relu = new ReLU<>(tf, ReLU.ALPHA_DEFAULT, 5f, ReLU.THRESHOLD_DEFAULT);
result = relu.call(input);
// result is [0.f, 0.f, 0.f, 5.f, 5.f]

// With threshold = 5
relu = new ReLU<>(tf, ReLU.ALPHA_DEFAULT, ReLU.MAX_VALUE_DEFAULT, 5f);
result = relu.call(input);
// result is [-0.f, -0.f,  0.f,  0.f, 10.f]
```
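The behavior in the examples above can be checked with a plain-Java sketch of the parameterized formula (an illustration assuming Keras-style semantics, not the TensorFlow implementation; `ReluSketch` and its `relu` helper are hypothetical names):

```java
// Hypothetical plain-Java sketch of the parameterized ReLU formula;
// NOT the TensorFlow implementation.
public class ReluSketch {

    // alpha: slope below the threshold; maxValue: saturation cap (NaN = none);
    // threshold: point below which the input is damped.
    static float relu(float x, float alpha, float maxValue, float threshold) {
        // pass-through at or above the threshold, scaled slope below it
        float y = (x >= threshold) ? x : alpha * (x - threshold);
        if (!Float.isNaN(maxValue)) {
            y = Math.min(y, maxValue); // saturate when a max value is set
        }
        return y;
    }

    public static void main(String[] args) {
        float[] input = {-10f, -5f, 0.0f, 5f, 10f};
        // alpha = 0.5, no maxValue, threshold = 0 mirrors the second example
        for (float x : input) {
            System.out.printf("%.1f ", relu(x, 0.5f, Float.NaN, 0f));
        }
        System.out.println(); // prints: -5.0 -2.5 0.0 5.0 10.0
    }
}
```

Note that with `alpha = 0`, values below the threshold are multiplied by zero, which is why the `threshold = 5` example above yields IEEE negative zeros (`-0.f`) for negative inputs.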

### Constants

- `float ALPHA_DEFAULT`
- `float MAX_VALUE_DEFAULT`
- `float THRESHOLD_DEFAULT`

### Public Constructors

- `ReLU(Ops tf)`: Creates a new ReLU with alpha=`ALPHA_DEFAULT`, maxValue=`MAX_VALUE_DEFAULT`, threshold=`THRESHOLD_DEFAULT`.
- `ReLU(Ops tf, float alpha, float maxValue, float threshold)`: Creates a new ReLU with the given parameters.

### Public Methods

- `Operand<T> call(Operand<T> input)`: Gets the calculation operation for the activation.

## Constants

#### public static final float ALPHA_DEFAULT

Constant Value: 0.0

#### public static final float MAX_VALUE_DEFAULT

Constant Value: NaN

#### public static final float THRESHOLD_DEFAULT

Constant Value: 0.0

## Public Constructors

#### public ReLU (Ops tf)

Creates a new ReLU with alpha=`ALPHA_DEFAULT`, maxValue=`MAX_VALUE_DEFAULT`, threshold=`THRESHOLD_DEFAULT`.

##### Parameters
- `tf`: the TensorFlow Ops

#### public ReLU (Ops tf, float alpha, float maxValue, float threshold)

Creates a new ReLU

##### Parameters
- `tf`: the TensorFlow Ops
- `alpha`: governs the slope for values lower than the threshold.
- `maxValue`: sets the saturation threshold (the largest value the function will return).
- `threshold`: the threshold value of the activation function below which values will be damped or set to zero.
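Taken together, the three parameters define a piecewise function; the following is a paraphrase assuming Keras-style semantics, not text from this API:

```
f(x) = \begin{cases}
  \min(x,\ \mathrm{maxValue})                   & \text{if } x \ge \mathrm{threshold} \\
  \mathrm{alpha} \cdot (x - \mathrm{threshold}) & \text{if } x < \mathrm{threshold}
\end{cases}
```

where the `min` is skipped when `maxValue` is unset (`NaN`).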

## Public Methods

#### public Operand&lt;T&gt; call (Operand&lt;T&gt; input)

Gets the calculation operation for the activation.

##### Parameters
- `input`: the input tensor

##### Returns

The operand for the activation