Computes Concatenated ReLU.
tf.nn.crelu(features, axis=-1, name=None)
Concatenates a ReLU which selects only the positive part of the activation
with a ReLU which selects only the negative part of the activation.
Note that as a result this non-linearity doubles the depth of the activations.
Source: Understanding and Improving Convolutional Neural Networks via
Concatenated Rectified Linear Units. W. Shang et al.
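
As a quick illustration, here is a minimal usage sketch, assuming TensorFlow 2.x where this op is exposed as tf.nn.crelu. It shows that crelu(x) matches concatenating the two one-sided ReLUs, and that the last dimension is doubled:

    import tensorflow as tf

    x = tf.constant([[-1.0, 2.0, -3.0]])  # shape (1, 3)
    y = tf.nn.crelu(x)                    # shape (1, 6): depth is doubled

    # crelu is equivalent to concatenating relu(x) with relu(-x):
    manual = tf.concat([tf.nn.relu(x), tf.nn.relu(-x)], axis=-1)
    print(y.numpy())       # [[0. 2. 0. 1. 0. 3.]]
    print(manual.numpy())  # identical values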
Args:
  features: A Tensor with type float, double, int32, int64, uint8, int16,
    or int8.
  name: A name for the operation (optional).
  axis: The axis that the output values are concatenated along. Default is -1.
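
To make the axis argument concrete, a short sketch (again assuming TensorFlow 2.x) showing which dimension gets doubled:

    import tensorflow as tf

    x = tf.random.normal([8, 32, 32, 16])  # NHWC batch of feature maps

    # Default axis=-1 concatenates along the channel dimension: 16 -> 32.
    print(tf.nn.crelu(x).shape)            # (8, 32, 32, 32)

    # axis=1 doubles that dimension instead.
    print(tf.nn.crelu(x, axis=1).shape)    # (8, 64, 32, 16)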
Returns:
  A Tensor with the same type as features.
References:
  Understanding and Improving Convolutional Neural Networks via
  Concatenated Rectified Linear Units: Shang et al., 2016.