tf.contrib.seq2seq.prepare_attention(attention_states, attention_option, num_units, reuse=False)

Prepare keys/values/functions for attention.

Args:

  • attention_states: hidden states to attend over.
  • attention_option: how to compute attention, either "luong" or "bahdanau".
  • num_units: hidden state dimension.
  • reuse: whether to reuse the variable scope.

Returns:

  • attention_keys: to be compared with target states.
  • attention_values: to be used to construct context vectors.
  • attention_score_fn: to compute similarity between key and target states.
  • attention_construct_fn: to build attention states.

Defined in tensorflow/contrib/seq2seq/python/ops/attention_decoder_fn.py.
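The returned attention_score_fn implements one of two scoring rules, selected by attention_option. As a rough illustration (a standalone NumPy sketch under assumed shapes and projections, not the contrib implementation), "luong" scores are multiplicative (dot product of the query with each key) while "bahdanau" scores are additive (v · tanh(key + projected query)); the scores are then softmaxed over the sequence and used to weight attention_values into a context vector:

```python
import numpy as np

# Illustrative sketch only: shapes, the pre-projected keys, and the
# processed query are assumptions for this example.

def luong_score(keys, query):
    """Multiplicative ("luong") score: dot product of query with each key."""
    # keys: [seq_len, num_units], query: [num_units] -> scores: [seq_len]
    return keys @ query

def bahdanau_score(keys, processed_query, v):
    """Additive ("bahdanau") score: v . tanh(key + projected query)."""
    # keys: [seq_len, num_units], processed_query: [num_units], v: [num_units]
    return np.tanh(keys + processed_query) @ v

def attend(scores, values):
    """Softmax the scores and use them to weight values into a context vector."""
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # values: [seq_len, num_units] -> context: [num_units]
    return weights @ values

rng = np.random.default_rng(0)
keys = rng.standard_normal((5, 8))    # plays the role of attention_keys
values = rng.standard_normal((5, 8))  # plays the role of attention_values
query = rng.standard_normal(8)        # a decoder (target) state

context_luong = attend(luong_score(keys, query), values)
context_bahdanau = attend(bahdanau_score(keys, query, rng.standard_normal(8)), values)
print(context_luong.shape, context_bahdanau.shape)  # (8,) (8,)
```

In the contrib API, the returned functions are not usually called directly; they are passed on to the decoder functions in the same module (e.g. tf.contrib.seq2seq.attention_decoder_fn_train), which invoke them at each decoding step.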