# tf.nn.crelu(features, name=None)

See the guide: Neural Network > Activation Functions

Computes Concatenated ReLU.

Concatenates a ReLU that selects only the positive part of the activation with a ReLU that selects only the negative part of the activation. Note that, as a result, this non-linearity doubles the depth of the activations. Source: [Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units](https://arxiv.org/abs/1603.05201).

#### Args:

• `features`: A `Tensor` with type `float`, `double`, `int32`, `int64`, `uint8`, `int16`, or `int8`.
• `name`: A name for the operation (optional).

#### Returns:

A `Tensor` with the same type as `features`.

Defined in `tensorflow/python/ops/nn_ops.py`.
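
#### Example

A minimal NumPy sketch of the semantics (not the TensorFlow implementation): CReLU concatenates `relu(x)` and `relu(-x)` along the last axis, so an input of depth `d` produces an output of depth `2d`.

```python
import numpy as np

def crelu(features, axis=-1):
    # Concatenated ReLU: concat(relu(x), relu(-x)) along the given axis.
    # The first half keeps the positive part; the second half keeps the
    # (negated) negative part, so the depth doubles.
    return np.concatenate(
        [np.maximum(features, 0), np.maximum(-features, 0)], axis=axis
    )

x = np.array([[-1.0, 2.0, -3.0]])
print(crelu(x))  # [[0. 2. 0. 1. 0. 3.]]  -- shape (1, 3) becomes (1, 6)
```

Calling `tf.nn.crelu` on the same input yields the equivalent result as a `Tensor`.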