Module: tfa.layers

Additional layers that conform to Keras API.

Modules

adaptive_pooling module: Pooling layers with fixed-size outputs.

gelu module: Implements GELU activation.

maxout module: Implements the Maxout layer.

multihead_attention module

normalizations module

optical_flow module: TensorFlow op performing the correlation cost operation.

poincare module: Implements the PoincareNormalize layer.

polynomial module: Implements the Polynomial Crossing layer.

sparsemax module

tlu module: Implements the Thresholded Linear Unit.

wrappers module

Classes

class AdaptiveAveragePooling1D: Average Pooling with adaptive kernel size.

class AdaptiveAveragePooling2D: Average Pooling with adaptive kernel size.

class AdaptiveAveragePooling3D: Average Pooling with adaptive kernel size.

class AdaptiveMaxPooling1D: Max Pooling with adaptive kernel size.

class AdaptiveMaxPooling2D: Max Pooling with adaptive kernel size.

class AdaptiveMaxPooling3D: Max Pooling with adaptive kernel size.

class CorrelationCost: Correlation Cost Layer.

class FilterResponseNormalization: Filter response normalization layer.

class GELU: Gaussian Error Linear Unit.

class GroupNormalization: Group normalization layer.

class InstanceNormalization: Instance normalization layer.

class Maxout: Applies Maxout to the input.

class MultiHeadAttention: MultiHead Attention layer.

class PoincareNormalize: Project into the Poincare ball with norm <= 1.0 - epsilon.

class PolynomialCrossing: Layer for Deep & Cross Network to learn explicit feature interactions.

class Sparsemax: Sparsemax activation function [1].

class TLU: Thresholded Linear Unit. An activation function similar to ReLU, but with a learned threshold.

class WeightNormalization: This wrapper reparameterizes a layer by decoupling the weight's magnitude and direction.