org.tensorflow.framework.activations


Classes

Activation<T extends TNumber> Abstract base class for Activations.

Note: The tf attribute must be set prior to invoking the call method.

ELU<T extends TFloating> Exponential linear unit. 
Exponential<T extends TFloating> Exponential activation function. 
HardSigmoid<T extends TFloating> Hard sigmoid activation. 
Linear<U extends TNumber> Linear activation function (pass-through). 
ReLU<T extends TNumber> Rectified Linear Unit (ReLU) activation. 
SELU<T extends TFloating> Scaled Exponential Linear Unit (SELU). 
Sigmoid<T extends TFloating> Sigmoid activation. 
Softmax<T extends TFloating> Softmax converts a real vector to a vector of categorical probabilities. 
Softplus<T extends TFloating> Softplus activation function, softplus(x) = log(exp(x) + 1). 
Softsign<T extends TFloating> Softsign activation function, softsign(x) = x / (abs(x) + 1). 
Swish<T extends TFloating> Swish activation function. 
Tanh<T extends TFloating> Hyperbolic tangent activation function.
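The element-wise formulas behind a few of the activations listed above (Softplus, Softsign, ReLU) can be sketched in plain Java with no TensorFlow dependency. This is only an illustration of the underlying math on scalar doubles, not the framework classes, which apply the same functions across tensor operands via their call method.

```java
public class ActivationFormulas {

    // softplus(x) = log(exp(x) + 1), a smooth approximation of ReLU
    static double softplus(double x) {
        return Math.log(Math.exp(x) + 1.0);
    }

    // softsign(x) = x / (abs(x) + 1), a softer-saturating alternative to tanh
    static double softsign(double x) {
        return x / (Math.abs(x) + 1.0);
    }

    // relu(x) = max(x, 0)
    static double relu(double x) {
        return Math.max(x, 0.0);
    }

    public static void main(String[] args) {
        System.out.println(softplus(0.0)); // ln(2) ≈ 0.6931
        System.out.println(softsign(1.0)); // 0.5
        System.out.println(relu(-3.0));    // 0.0
    }
}
```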