Module: tf.keras.ops.nn

DO NOT EDIT. This file was autogenerated; edits made by hand will be overwritten.

Functions

average_pool(...): Average pooling operation.
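
A minimal sketch of the pooling signature, assuming Keras 3 semantics and channels-last inputs; max_pool (listed below) takes the same arguments:

```python
import numpy as np
import keras  # with TensorFlow >= 2.16, these ops are also reachable as tf.keras.ops

# NHWC input: 2 images, 8x8 pixels, 3 channels (channels_last assumed).
x = np.random.rand(2, 8, 8, 3).astype("float32")

# 2x2 average pooling with stride 2 halves the spatial dimensions.
y = keras.ops.average_pool(x, pool_size=2, strides=2, padding="valid")
print(y.shape)  # (2, 4, 4, 3)
```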

batch_normalization(...): Normalizes x by the given mean and variance.
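
A sketch of the expected call, assuming the statistics are computed externally (here with keras.ops.mean and keras.ops.var over the batch axis):

```python
import numpy as np
import keras

x = np.random.rand(4, 3).astype("float32")
mean = keras.ops.mean(x, axis=0)
variance = keras.ops.var(x, axis=0)

# Normalize each feature column to roughly zero mean and unit variance.
y = keras.ops.batch_normalization(x, mean, variance, axis=-1, epsilon=1e-3)
```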

binary_crossentropy(...): Computes binary cross-entropy loss between target and output tensors.

categorical_crossentropy(...): Computes categorical cross-entropy loss between target and output tensors.
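
A sketch covering both cross-entropy ops above, assuming probabilities rather than logits (pass from_logits=True to feed raw scores instead):

```python
import numpy as np
import keras

# Binary case: targets in {0, 1}, outputs are probabilities.
bce = keras.ops.binary_crossentropy(
    target=np.array([0.0, 1.0, 1.0]),
    output=np.array([0.1, 0.9, 0.8]),
)

# Categorical case: one-hot targets vs. a probability distribution per row.
cce = keras.ops.categorical_crossentropy(
    target=np.array([[0.0, 1.0, 0.0]]),
    output=np.array([[0.1, 0.8, 0.1]]),
)
```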

conv(...): General N-D convolution.

conv_transpose(...): General N-D convolution transpose.
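
A 2-D sketch of conv and conv_transpose, assuming channels-last layout; the transposed kernel is assumed (as for Conv2DTranspose) to carry output channels before input channels:

```python
import numpy as np
import keras

x = np.random.rand(1, 8, 8, 3).astype("float32")   # NHWC input
w = np.random.rand(3, 3, 3, 16).astype("float32")  # conv kernel: (kh, kw, in, out)

y = keras.ops.conv(x, w, strides=1, padding="same")
print(y.shape)  # (1, 8, 8, 16)

# conv_transpose upsamples; kernel layout assumed (kh, kw, out, in).
wt = np.random.rand(3, 3, 3, 16).astype("float32")
z = keras.ops.conv_transpose(y, wt, strides=2, padding="same")
print(z.shape)  # (1, 16, 16, 3)
```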

ctc_decode(...): Decodes the output of a CTC model.

ctc_loss(...): CTC (Connectionist Temporal Classification) loss.
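
A rough ctc_loss call, assuming per-timestep logits and class 0 reserved as the blank/mask index; label and output lengths are passed explicitly:

```python
import numpy as np
import keras

batch, timesteps, num_classes = 2, 10, 5
logits = np.random.rand(batch, timesteps, num_classes).astype("float32")
labels = np.array([[1, 2, 3], [2, 4, 1]])  # int targets, (batch, max_target_length)

loss = keras.ops.ctc_loss(
    target=labels,
    output=logits,
    target_length=np.array([3, 3]),    # valid labels per sample
    output_length=np.array([10, 10]),  # valid timesteps per sample
    mask_index=0,                      # blank class, assumed to be 0 here
)
```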

depthwise_conv(...): General N-D depthwise convolution.
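
A sketch of the depthwise variant; the kernel's last axis is a per-channel multiplier, so each input channel gets its own filter(s) and no cross-channel mixing occurs:

```python
import numpy as np
import keras

x = np.random.rand(1, 8, 8, 3).astype("float32")
# Depthwise kernel: (kh, kw, in_channels, channel_multiplier).
dw = np.random.rand(3, 3, 3, 2).astype("float32")

y = keras.ops.depthwise_conv(x, dw, strides=1, padding="same")
print(y.shape)  # (1, 8, 8, 6) -- 3 input channels x multiplier 2
```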

elu(...): Exponential Linear Unit activation function.

gelu(...): Gaussian Error Linear Unit (GELU) activation function.

hard_sigmoid(...): Hard sigmoid activation function.

hard_silu(...): Hard SiLU activation function, also known as Hard Swish.

hard_swish(...): Hard SiLU activation function, also known as Hard Swish.

leaky_relu(...): Leaky version of a Rectified Linear Unit activation function.

log_sigmoid(...): Logarithm of the sigmoid activation function.

log_softmax(...): Log-softmax activation function.
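
The activation ops above (and the simpler ones further down: relu, selu, sigmoid, softplus, softsign, etc.) are elementwise and share a one-tensor calling pattern; a brief sketch with assumed default arguments:

```python
import numpy as np
import keras

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

print(keras.ops.elu(x, alpha=1.0))
print(keras.ops.gelu(x))                            # approximate=True by default
print(keras.ops.leaky_relu(x, negative_slope=0.2))
print(keras.ops.hard_silu(x))                       # hard_swish is the same op
print(keras.ops.log_softmax(x, axis=-1))            # normalizes over the last axis
```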

max_pool(...): Max pooling operation.

moments(...): Calculates the mean and variance of x.
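
A sketch showing moments feeding batch_normalization; note the argument is axes (a list), and the op returns a (mean, variance) pair:

```python
import numpy as np
import keras

x = np.random.rand(4, 3).astype("float32")

mean, variance = keras.ops.moments(x, axes=[0])  # per-feature statistics
y = keras.ops.batch_normalization(x, mean, variance, axis=-1)
```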

multi_hot(...): Encodes integer labels as multi-hot vectors.

normalize(...): Normalizes x over the specified axis.
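
A one-line sketch: with the assumed defaults (axis=-1, order=2) this is row-wise L2 normalization:

```python
import numpy as np
import keras

x = np.array([[3.0, 4.0]])
print(keras.ops.normalize(x))  # [[0.6 0.8]] -- each row divided by its L2 norm
```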

one_hot(...): Converts integer tensor x into a one-hot tensor.
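
A sketch of one_hot together with multi_hot from above, assuming integer class indices:

```python
import numpy as np
import keras

print(keras.ops.one_hot(np.array([0, 2, 1]), num_classes=3))
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]]

# multi_hot collapses a set of labels into one indicator vector.
print(keras.ops.multi_hot(np.array([0, 2]), num_classes=4))
# [1. 0. 1. 0.]
```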

relu(...): Rectified linear unit activation function.

relu6(...): Rectified linear unit activation function with an upper bound of 6.

selu(...): Scaled Exponential Linear Unit (SELU) activation function.

separable_conv(...): General N-D separable convolution.
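
A sketch composing the depthwise and pointwise stages; the pointwise kernel's input axis is assumed to be in_channels x multiplier:

```python
import numpy as np
import keras

x = np.random.rand(1, 8, 8, 3).astype("float32")
dw = np.random.rand(3, 3, 3, 1).astype("float32")   # depthwise: (kh, kw, in, multiplier)
pw = np.random.rand(1, 1, 3, 16).astype("float32")  # pointwise: (1, 1, in*multiplier, out)

y = keras.ops.separable_conv(x, dw, pw, strides=1, padding="same")
print(y.shape)  # (1, 8, 8, 16)
```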

sigmoid(...): Sigmoid activation function.

silu(...): Sigmoid Linear Unit (SiLU) activation function, also known as Swish.

softmax(...): Softmax activation function.

softplus(...): Softplus activation function.

softsign(...): Softsign activation function.

sparse_categorical_crossentropy(...): Computes sparse categorical cross-entropy loss.
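
A sketch pairing sparse_categorical_crossentropy with softmax from above: targets are integer class ids, and passing from_logits=True is equivalent to applying softmax first:

```python
import numpy as np
import keras

logits = np.array([[2.0, 1.0, 0.1]])
labels = np.array([0])  # integer class ids, not one-hot

loss = keras.ops.sparse_categorical_crossentropy(labels, logits, from_logits=True)

# Equivalent: normalize to probabilities first, then use the default.
probs = keras.ops.softmax(logits, axis=-1)
loss2 = keras.ops.sparse_categorical_crossentropy(labels, probs)
```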

swish(...): Sigmoid Linear Unit (SiLU) activation function, also known as Swish.