Transformer layer with ReZero.
tfm.nlp.layers.ReZeroTransformer(
    num_attention_heads,
    inner_dim=768,
    inner_activation=tfm.utils.activations.gelu,
    dropout_rate=0.0,
    attention_dropout_rate=0.0,
    output_range=None,
    kernel_initializer='glorot_uniform',
    bias_initializer='zeros',
    kernel_regularizer=None,
    bias_regularizer=None,
    activity_regularizer=None,
    kernel_constraint=None,
    bias_constraint=None,
    use_layer_norm=False,
    share_rezero=True,
    **kwargs
)
This layer implements the Transformer from "Attention Is All You Need" (https://arxiv.org/abs/1706.03762). The residual connection implements the ReZero method (https://arxiv.org/abs/2003.04887).
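A minimal usage sketch (assuming the [hidden_states, attention_mask] list-input convention shared by the other Transformer blocks in tfm.nlp.layers; the shapes and hyperparameters below are illustrative, not defaults):

```python
import tensorflow as tf
import tensorflow_models as tfm

# Illustrative shapes: batch of 2 sequences, 16 tokens, hidden width 768.
hidden_states = tf.random.normal([2, 16, 768])
attention_mask = tf.ones([2, 16, 16])  # every position may attend to every other

layer = tfm.nlp.layers.ReZeroTransformer(
    num_attention_heads=12,  # hidden width (768) must be divisible by this
    inner_dim=3072,          # width of the feed-forward inner layer
)

# Forward pass; the output keeps the input's hidden shape.
outputs = layer([hidden_states, attention_mask])
print(outputs.shape)  # (2, 16, 768)
```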
Methods
call
call(
inputs, output_range: Optional[tf.Tensor] = None
) -> tf.Tensor
This is where the layer's logic lives.
The call() method may not create state (except in its first invocation, wrapping the creation of variables or other resources in tf.init_scope()). It is recommended to create state, including tf.Variable instances and nested Layer instances, in __init__(), or in the build() method that is called automatically before call() executes for the first time.
Args | |
---|---|
inputs | Input tensor, or dict/list/tuple of input tensors. The first positional inputs argument is subject to special rules: |
*args | Additional positional arguments. May contain tensors, although this is not recommended, for the reasons above. |
**kwargs | Additional keyword arguments. May contain tensors, although this is not recommended, for the reasons above. The following optional keyword arguments are reserved: training: Boolean scalar tensor or Python boolean indicating whether the call is meant for training or inference. mask: Boolean input mask. If the layer's call() method takes a mask argument, its default value will be set to the mask generated for inputs by the previous layer (if inputs did come from a layer that generated a corresponding mask, i.e. if it came from a Keras layer with masking support). |
Returns | |
---|---|
A tensor or list/tuple of tensors. |
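A sketch of the reserved training keyword and the output_range argument from the signature above (assuming output_range behaves as in the related Transformer blocks in tfm.nlp.layers, slicing the target sequence to its first output_range positions; layer, hidden_states, and attention_mask are reused from the earlier sketch):

```python
# Dropout is only active when training=True (one of the reserved kwargs above).
train_out = layer([hidden_states, attention_mask], training=True)

# Assumed semantics: output_range=1 computes outputs only for the first
# target position, e.g. a [CLS]-style token.
first_token_out = layer([hidden_states, attention_mask], output_range=1)

print(train_out.shape)        # (2, 16, 768)
print(first_token_out.shape)  # (2, 1, 768)
```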
reset_rezero
reset_rezero()
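Judging by its name, reset_rezero() re-initializes the layer's trainable ReZero residual weight(s) back to zero; the snippet below is only a hypothetical usage sketch, not taken from the library's documentation:

```python
# Hypothetical usage: return the residual gate(s) to zero, e.g. before a
# fresh fine-tuning phase, without rebuilding the rest of the layer.
layer.reset_rezero()
```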