Computes Concatenated ReLU.
Compat aliases for migration: `tf.compat.v1.nn.crelu`. See the Migration guide for more details.
tf.nn.crelu( features, name=None, axis=-1 )
Concatenates a ReLU which selects only the positive part of the activation with a ReLU which selects only the negative part of the activation. As a result, this non-linearity doubles the depth of the activations along the concatenation axis. Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units, W. Shang et al.
|Args| |
|`features`|A `Tensor` of floating-point or integer type.|
|`name`|A name for the operation (optional).|
|`axis`|The axis that the output values are concatenated along. Default is -1.|

|Returns| |
|A `Tensor` with the same type as `features`.|
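The behavior described above can be sketched without TensorFlow: CReLU is equivalent to concatenating `relu(x)` and `relu(-x)` along the given axis, which is why the output depth doubles. The NumPy function below is an illustrative reimplementation, not the library's own code:

```python
import numpy as np

def crelu(features, axis=-1):
    """Sketch of CReLU semantics: concatenate ReLU(x) and ReLU(-x) along `axis`."""
    positive = np.maximum(features, 0)   # keeps only the positive part
    negative = np.maximum(-features, 0)  # keeps only (the magnitude of) the negative part
    return np.concatenate([positive, negative], axis=axis)

x = np.array([[-1.0, 2.0],
              [ 3.0, -4.0]])
y = crelu(x)
# y has shape (2, 4): the depth along axis -1 doubled from 2 to 4.
# y == [[0., 2., 1., 0.],
#       [3., 0., 0., 4.]]
```

Note that every input value appears in exactly one half of the output (as a non-negative number), so no information about the sign of the activation is discarded, unlike a plain ReLU.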