Softplus

public class Softplus

Softplus activation function, softplus(x) = log(exp(x) + 1).

Example Usage:

     Operand<TFloat32> input = tf.constant(
             new float[] {-20f, -1.0f, 0.0f, 1.0f, 20f});
     Softplus<TFloat32> softplus = new Softplus<>(tf);
     Operand<TFloat32> result = softplus.call(input);
     // result is [2.0611537e-09f, 3.1326166e-01f, 6.9314718e-01f,
     //            1.3132616e+00f, 2.0000000e+01f]
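The values above can be reproduced in plain Java. The sketch below is not the TensorFlow op itself, only the underlying math; the class and method names are illustrative. A naive `Math.log(Math.exp(x) + 1)` overflows for large x, so the standard numerically stable rearrangement `max(x, 0) + log1p(exp(-|x|))` is used instead.

```java
// Plain-Java sketch of the softplus math (illustrative; not the TensorFlow op).
public class SoftplusSketch {
    // Numerically stable softplus: avoids overflow of exp(x) for large x.
    static float softplus(float x) {
        return (float) (Math.max(x, 0.0) + Math.log1p(Math.exp(-Math.abs(x))));
    }

    public static void main(String[] args) {
        float[] inputs = {-20f, -1.0f, 0.0f, 1.0f, 20f};
        for (float x : inputs) {
            System.out.printf("softplus(%.1f) = %e%n", x, softplus(x));
        }
    }
}
```

Note that softplus(0) = log(2) ≈ 0.6931472, matching the middle element of the result above, and that for large inputs softplus(x) ≈ x, which is why softplus(20) is reported as 2.0e+01.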

Public Constructors

Softplus(Ops tf)
Creates a Softplus activation function.

Public Methods

Operand<T>
call(Operand<T> input)
Gets the calculation operation for the activation.

Inherited Methods

Public Constructors

public Softplus(Ops tf)

Creates a Softplus activation function.

Parameters
tf the TensorFlow Ops

Public Methods

public Operand<T> call(Operand<T> input)

Gets the calculation operation for the activation.

Parameters
input the input tensor
Returns
  • The operand for the activation