tf.keras.layers.Bidirectional

Bidirectional wrapper for RNNs.

Inherits From: Wrapper

tf.keras.layers.Bidirectional(
    layer, merge_mode='concat', weights=None, backward_layer=None, **kwargs
)

Arguments:

  • layer: keras.layers.RNN instance, such as keras.layers.LSTM or keras.layers.GRU. It could also be a keras.layers.Layer instance that meets the following criteria:
    1. Be a sequence-processing layer (accepts 3D+ inputs).
    2. Have go_backwards, return_sequences and return_state attributes (with the same semantics as for the RNN class).
    3. Have an input_spec attribute.
    4. Implement serialization via get_config() and from_config(). Note that the recommended way to create new RNN layers is to write a custom RNN cell and use it with keras.layers.RNN, instead of subclassing keras.layers.Layer directly (see the cell sketch after this list).
  • merge_mode: Mode by which outputs of the forward and backward RNNs will be combined. One of {'sum', 'mul', 'concat', 'ave', None}. If None, the outputs will not be combined; they will be returned as a list. Defaults to 'concat' (see the shape sketch after this list).
  • backward_layer: Optional keras.layers.RNN or keras.layers.Layer instance to be used to handle backwards input processing. If backward_layer is not provided, the layer instance passed as the layer argument will be used to generate the backward layer automatically. Note that the provided backward_layer should have properties matching those of the layer argument; in particular, it should have the same values for stateful, return_state, return_sequences, etc. In addition, backward_layer and layer should have different go_backwards argument values. A ValueError will be raised if these requirements are not met.
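
As a concrete illustration of the criteria above, here is a minimal sketch of the recommended route: a bare-bones custom cell (a hypothetical MinimalRNNCell, used only for illustration) wrapped in keras.layers.RNN, whose result satisfies the requirements and can be passed to Bidirectional:

import tensorflow as tf

class MinimalRNNCell(tf.keras.layers.Layer):
    # A bare-bones cell: one state tensor the size of the output.
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.state_size = units

    def build(self, input_shape):
        self.kernel = self.add_weight(
            shape=(input_shape[-1], self.units),
            initializer='uniform', name='kernel')
        self.recurrent_kernel = self.add_weight(
            shape=(self.units, self.units),
            initializer='uniform', name='recurrent_kernel')

    def call(self, inputs, states):
        prev_output = states[0]
        output = (tf.matmul(inputs, self.kernel)
                  + tf.matmul(prev_output, self.recurrent_kernel))
        return output, [output]

# keras.layers.RNN meets the criteria above, so the result can be wrapped.
bidi = tf.keras.layers.Bidirectional(
    tf.keras.layers.RNN(MinimalRNNCell(8), return_sequences=True))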
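
And a quick sketch of how merge_mode affects the output shape (a 16-unit LSTM over length-5 sequences is assumed here):

import tensorflow as tf

x = tf.zeros((2, 5, 10))  # (batch, timesteps, features)

# 'concat' (the default) stacks the two directions along the feature axis.
y = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(16, return_sequences=True))(x)
print(y.shape)  # (2, 5, 32)

# merge_mode=None returns the forward and backward outputs as a list,
# each of shape (2, 5, 16); 'sum', 'mul' and 'ave' also keep (2, 5, 16).
y_fwd, y_bwd = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(16, return_sequences=True), merge_mode=None)(x)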

Call arguments:

The call arguments for this layer are the same as those of the wrapped RNN layer.
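
One subtlety worth sketching: when an initial_state is passed, the wrapper expects a flat list whose first half initializes the forward layer and whose second half initializes the backward layer. This split is an implementation detail of the wrapper, so treat the following as a sketch:

import tensorflow as tf

inputs = tf.keras.Input(shape=(5, 10))
# An LSTM carries two state tensors (h, c), so the wrapper takes four:
# the first two seed the forward layer, the last two the backward layer.
init = [tf.keras.Input(shape=(8,)) for _ in range(4)]
out = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(8))(inputs, initial_state=init)
model = tf.keras.Model([inputs] + init, out)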

Raises:

  • ValueError:
    1. If layer or backward_layer is not a Layer instance.
    2. In case of an invalid merge_mode argument.
    3. If backward_layer has mismatched properties compared to layer.

Examples:

from tensorflow.keras.layers import Activation, Bidirectional, Dense, LSTM
from tensorflow.keras.models import Sequential

model = Sequential()
# With the default merge_mode='concat', each timestep of the first layer
# yields 20 features (2 directions x 10 units).
model.add(Bidirectional(LSTM(10, return_sequences=True), input_shape=(5, 10)))
model.add(Bidirectional(LSTM(10)))
model.add(Dense(5))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

# With a custom backward layer
model = Sequential()
forward_layer = LSTM(10, return_sequences=True)
# The backward layer must set go_backwards=True and otherwise match the
# forward layer (same stateful, return_sequences, return_state values).
backward_layer = LSTM(10, activation='relu', return_sequences=True,
                      go_backwards=True)
model.add(Bidirectional(forward_layer, backward_layer=backward_layer,
                        input_shape=(5, 10)))
model.add(Dense(5))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
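
When backward_layer is omitted, the wrapper builds it automatically by copying the configuration of layer and flipping go_backwards; supplying it explicitly, as above, is what lets the two directions differ (here, in their activation).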

Attributes:

  • constraints

Methods

reset_states

reset_states()
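
reset_states resets the recorded states of both the forward and the backward RNN, and is only meaningful when the wrapped layer is stateful. A minimal sketch, assuming a stateful LSTM with a fixed batch size (required for stateful RNNs):

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(4, stateful=True),
        batch_input_shape=(2, 5, 10)),
])
model.predict(np.zeros((2, 5, 10)))  # running the model populates the states
model.layers[0].reset_states()       # clear both directions' states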