Module: tfa.layers.multihead_attention

Classes

class MultiHeadAttention: Multi-head attention layer.
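
A minimal usage sketch of the class listed above. It assumes the TensorFlow Addons constructor takes `head_size` and `num_heads` and that the layer is called on a `[query, key, value]` list; check the class page for the exact signature and defaults.

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Build the layer (argument names assumed; see the MultiHeadAttention class docs).
mha = tfa.layers.MultiHeadAttention(head_size=64, num_heads=8)

# Query and key/value sequences may have different lengths and depths.
query = tf.random.uniform((32, 20, 128))  # (batch_size, query_elements, query_depth)
key = tf.random.uniform((32, 15, 128))    # (batch_size, key_elements, key_depth)
value = tf.random.uniform((32, 15, 128))  # (batch_size, key_elements, value_depth)

# The layer is called on a list of tensors; output shape is
# (batch_size, query_elements, output_size).
attention = mha([query, key, value])
print(attention.shape)
```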