This can be seen as a softplus applied to the scaled input, with the output
appropriately scaled: scaled_softplus(x, alpha) = alpha * softplus(x / alpha)
= alpha * log(1 + exp(x / alpha)). The clipping is optional: when clip is
given, the result is min(scaled_softplus(x, alpha), clip). As alpha -> 0,
scaled_softplus(x, alpha) tends to relu(x), and scaled_softplus(x, alpha,
clip=6) tends to relu6(x).
Args:
  x: A Tensor of inputs.
  alpha: A Tensor, indicating the amount of smoothness. The caller must
    ensure that alpha > 0.
  clip: (optional) A Tensor, the upper bound to clip the values.
  name: A name for the scope of the operations (optional).
Returns:
  A tensor of the size and type determined by broadcasting of the inputs.
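A minimal sketch of the behavior described above, built from standard
TensorFlow ops (tf.nn.softplus, tf.minimum). It illustrates the formula and
the relu/relu6 limits; it is not the library's own implementation, and the
name scaled_softplus_sketch is chosen here for illustration:

  import tensorflow as tf

  def scaled_softplus_sketch(x, alpha, clip=None):
      # y = alpha * softplus(x / alpha) = alpha * log(1 + exp(x / alpha)),
      # optionally clipped from above at `clip`.
      # tf.nn.softplus is numerically stable for large x / alpha.
      y = alpha * tf.nn.softplus(x / alpha)
      if clip is not None:
          y = tf.minimum(y, clip)
      return y

  # For small alpha the result approaches relu(x), or relu6(x) with clip=6:
  x = tf.constant([-2.0, 0.0, 3.0, 10.0])
  y = scaled_softplus_sketch(x, alpha=0.01)             # ~ [0, 0, 3, 10]
  y6 = scaled_softplus_sketch(x, alpha=0.01, clip=6.0)  # ~ [0, 0, 3, 6]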
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2020-10-01 UTC."],[],[]]