
Module: tf.keras.activations

Built-in activation functions.

Functions

`deserialize(...)`: Returns the activation function denoted by the input string.

`elu(...)`: Exponential linear unit.

`exponential(...)`: Exponential activation function.

`get(...)`: Returns the activation function for a string identifier (callables are returned unchanged).

`hard_sigmoid(...)`: Hard sigmoid activation function.

`linear(...)`: Linear activation function.

`relu(...)`: Applies the rectified linear unit activation function.

`selu(...)`: Scaled Exponential Linear Unit (SELU).

`serialize(...)`: Returns name attribute (`__name__`) of function.

`sigmoid(...)`: Sigmoid activation function.

`softmax(...)`: Softmax converts a real vector to a vector of categorical probabilities.

`softplus(...)`: Softplus activation function.

`softsign(...)`: Softsign activation function.

`tanh(...)`: Hyperbolic tangent activation function.
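A minimal sketch of how these functions are typically used, assuming TensorFlow 2.x is installed: activations can be applied directly to tensors, looked up by string name via `get`, and round-tripped through `serialize`/`deserialize`.

```python
import tensorflow as tf

# Apply an activation directly to a tensor.
x = tf.constant([-2.0, 0.0, 2.0])
relu_out = tf.keras.activations.relu(x)
print(relu_out.numpy())  # [0. 0. 2.]

# Softmax maps logits to probabilities that sum to ~1 along the last axis.
logits = tf.constant([[1.0, 2.0, 3.0]])
probs = tf.keras.activations.softmax(logits)
print(float(tf.reduce_sum(probs)))  # sums to ~1.0

# Look up an activation by name, then round-trip through serialize/deserialize.
fn = tf.keras.activations.get("tanh")
name = tf.keras.activations.serialize(fn)
same_fn = tf.keras.activations.deserialize(name)

# In practice, activations are most often passed to layers by string or callable.
layer = tf.keras.layers.Dense(4, activation="relu")
```

Passing the string `"relu"` to a layer is equivalent to passing `tf.keras.activations.relu`; the layer resolves the string with `get` internally.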
