tf.contrib.nn.scaled_softplus(x, alpha, clip=None, name=None)

Returns y = alpha * ln(1 + exp(x / alpha)), or min(y, clip) when clip is given.
This can be seen as a softplus applied to the scaled input, with the output
appropriately scaled. As alpha tends to 0, scaled_softplus(x, alpha) tends to
relu(x); with the optional clipping, scaled_softplus(x, alpha, clip=6) tends
to relu6(x).
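
A brief usage sketch, assuming a TensorFlow 1.x environment (where tf.contrib
is available); the input values are illustrative:

import tensorflow as tf  # assumes TensorFlow 1.x, where tf.contrib exists

x = tf.constant([-2.0, 1.0, 8.0])

# Smooth approximation of relu(x); smaller alpha hugs relu more tightly.
y = tf.contrib.nn.scaled_softplus(x, alpha=0.1)

# With clipping: approaches relu6(x) as alpha -> 0.
y6 = tf.contrib.nn.scaled_softplus(x, alpha=0.1, clip=6.0)

with tf.Session() as sess:
    print(sess.run(y))   # close to [0., 1., 8.]
    print(sess.run(y6))  # close to [0., 1., 6.]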
Args:
x: A Tensor of inputs.
alpha: A Tensor, indicating the amount of smoothness. The caller must ensure
that alpha > 0.
clip: (optional) A Tensor, the upper bound to clip the values.
name: A name for the scope of the operations (optional).
Returns:
A tensor of the size and type determined by broadcasting of the inputs.
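
For environments without tf.contrib, the documented formula is straightforward
to express with core ops. This is a minimal sketch (the helper name
scaled_softplus_ref is ours, not part of the library), not the contrib
implementation itself:

import tensorflow as tf

def scaled_softplus_ref(x, alpha, clip=None):
    # softplus(x / alpha) = ln(1 + exp(x / alpha)), computed stably,
    # so y = alpha * ln(1 + exp(x / alpha)).
    y = alpha * tf.nn.softplus(x / alpha)
    if clip is not None:
        # The clipped variant returns min(y, clip).
        y = tf.minimum(y, clip)
    return y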