
# Module: tf.keras.activations

Built-in activation functions.

## Functions

`deserialize(...)`: Returns the activation function corresponding to a string identifier.

`elu(...)`: Exponential Linear Unit.

`exponential(...)`: Exponential activation function.

`gelu(...)`: Applies the Gaussian error linear unit (GELU) activation function.

`get(...)`: Retrieves an activation function by string identifier; a callable passed in is returned unchanged.

`hard_sigmoid(...)`: Hard sigmoid activation function.

`linear(...)`: Linear activation function (pass-through).

`relu(...)`: Applies the rectified linear unit activation function.

`selu(...)`: Scaled Exponential Linear Unit (SELU).

`serialize(...)`: Returns the string identifier of an activation function.

`sigmoid(...)`: Sigmoid activation function, `sigmoid(x) = 1 / (1 + exp(-x))`.

`softmax(...)`: Softmax converts a real vector to a vector of categorical probabilities.

`softplus(...)`: Softplus activation function, `softplus(x) = log(exp(x) + 1)`.

`softsign(...)`: Softsign activation function, `softsign(x) = x / (abs(x) + 1)`.

`swish(...)`: Swish activation function, `swish(x) = x * sigmoid(x)`.

`tanh(...)`: Hyperbolic tangent activation function.
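Several entries above quote their defining formulas directly. As a minimal pure-Python sketch of those formulas (not the TensorFlow implementations, which operate elementwise on tensors), the scalar versions can be written and checked with only the standard `math` module:

```python
import math

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + math.exp(-x))

def softplus(x):
    # softplus(x) = log(exp(x) + 1)
    return math.log(math.exp(x) + 1.0)

def softsign(x):
    # softsign(x) = x / (abs(x) + 1)
    return x / (abs(x) + 1.0)

def swish(x):
    # swish(x) = x * sigmoid(x)
    return x * sigmoid(x)

def softmax(xs):
    # Converts a real vector into a vector of categorical probabilities.
    # Subtracting max(xs) before exponentiating improves numerical
    # stability without changing the result.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(sigmoid(0.0))              # 0.5
print(softmax([1.0, 2.0, 3.0]))  # three probabilities summing to 1
```

The `tf.keras.activations` versions accept tensors and support extra arguments (e.g. `axis` for `softmax`, `alpha` for `relu`), so this sketch only mirrors the scalar math, not the full API.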

[{ "type": "thumb-down", "id": "missingTheInformationINeed", "label":"Falta la información que necesito" },{ "type": "thumb-down", "id": "tooComplicatedTooManySteps", "label":"Muy complicado o demasiados pasos" },{ "type": "thumb-down", "id": "outOfDate", "label":"Desactualizado" },{ "type": "thumb-down", "id": "samplesCodeIssue", "label":"Samples / code issue" },{ "type": "thumb-down", "id": "otherDown", "label":"Otro" }]
[{ "type": "thumb-up", "id": "easyToUnderstand", "label":"Fácil de comprender" },{ "type": "thumb-up", "id": "solvedMyProblem", "label":"Resolvió mi problema" },{ "type": "thumb-up", "id": "otherUp", "label":"Otro" }]