tfm.utils.activations.simple_swish

Computes the Swish activation function.

The tf.nn.swish operation uses a custom gradient to reduce memory usage. Because saving custom gradients in SavedModel is currently not supported, an exported TF-Hub module could not be used for fine-tuning. This wrapper therefore lets you choose between the native TensorFlow swish operation and a customized operation that uses TensorFlow's default gradient computation.
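
A minimal sketch of the idea, assuming the standard Swish definition x * sigmoid(x) with gradients left to TensorFlow's default autodiff; this is an illustration, not the library's exact implementation.

```python
import tensorflow as tf

def simple_swish_sketch(features):
  """Swish without tf.nn.swish's custom gradient (illustrative sketch)."""
  features = tf.convert_to_tensor(features)
  # Swish: x * sigmoid(x); gradients come from default autodiff,
  # so the result can be exported in a SavedModel and fine-tuned.
  return features * tf.math.sigmoid(features)

# Example usage:
x = tf.constant([-1.0, 0.0, 1.0])
print(simple_swish_sketch(x))  # approx. [-0.2689, 0.0, 0.7311]
```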

Args
  features: A Tensor representing preactivation values.

Returns
  The activation value.