Module: tfa.layers


Additional layers that conform to the Keras API.


adaptive_pooling module: Pooling layers with fixed-size outputs.

esn module: Implements Echo State Network (ESN) recurrent layer.

gelu module: Implements GELU activation.

maxout module: Implements Maxout layer.

multihead_attention module: Implements MultiHeadAttention layer.

normalizations module: Implements normalization layers.

optical_flow module: TensorFlow op performing the correlation cost operation.

poincare module: Implements PoincareNormalize layer.

polynomial module: Implements Polynomial Crossing Layer.

snake module: Implements Snake layer.

sparsemax module: Implements Sparsemax activation function.

spatial_pyramid_pooling module: Spatial Pyramid Pooling layers.

spectral_normalization module: Implements SpectralNormalization layer wrapper.

tlu module: Implements Thresholded Linear Unit.

wrappers module: Implements WeightNormalization layer wrapper.


class AdaptiveAveragePooling1D: Average Pooling with adaptive kernel size.

class AdaptiveAveragePooling2D: Average Pooling with adaptive kernel size.

class AdaptiveAveragePooling3D: Average Pooling with adaptive kernel size.

class AdaptiveMaxPooling1D: Max Pooling with adaptive kernel size.

class AdaptiveMaxPooling2D: Max Pooling with adaptive kernel size.

class AdaptiveMaxPooling3D: Max Pooling with adaptive kernel size.
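The adaptive pooling layers above take a target output size and derive the pooling regions from it, instead of taking a fixed kernel size. A minimal pure-Python sketch of 1D adaptive average pooling, using one common start/end bin convention (the helper name and the exact bin boundaries are illustrative assumptions, not the tfa implementation):

```python
import math

def adaptive_avg_pool_1d(xs, output_size):
    """Average-pool a 1D sequence down to exactly `output_size` values."""
    n = len(xs)
    out = []
    for i in range(output_size):
        # Output bin i covers input indices [start, end); bins may overlap
        # slightly when output_size does not divide n.
        start = (i * n) // output_size
        end = math.ceil((i + 1) * n / output_size)
        window = xs[start:end]
        out.append(sum(window) / len(window))
    return out

print(adaptive_avg_pool_1d([1, 2, 3, 4, 5, 6], 3))  # → [1.5, 3.5, 5.5]
```

The 2D and 3D variants apply the same idea independently along each spatial axis.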

class CorrelationCost: Correlation Cost Layer.

class ESN: Echo State Network layer.

class FilterResponseNormalization: Filter response normalization layer.

class GELU: Gaussian Error Linear Unit.
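The GELU activation weights its input by the standard normal CDF, `gelu(x) = x * Phi(x)`. A scalar sketch of the exact form (not the tfa layer itself):

```python
import math

def gelu(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF,
    # expressed via the error function.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

print(gelu(0.0))  # → 0.0
```

For large positive x this approaches the identity; for large negative x it approaches zero, like a smoothed ReLU.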

class GroupNormalization: Group normalization layer.

class InstanceNormalization: Instance normalization layer.

class Maxout: Applies Maxout to the input.
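Maxout reduces the channel dimension by taking the maximum over contiguous groups of channels. A minimal sketch of that reduction on a flat feature vector (the function name is illustrative):

```python
def maxout(features, num_units):
    """Take the max over groups of channels, shrinking the last dim to num_units."""
    k = len(features) // num_units  # group size; len(features) must be divisible
    return [max(features[i * k:(i + 1) * k]) for i in range(num_units)]

print(maxout([1.0, 5.0, 2.0, 4.0], num_units=2))  # → [5.0, 4.0]
```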

class MultiHeadAttention: MultiHead Attention layer.

class PoincareNormalize: Project into the Poincare ball with norm <= 1.0 - epsilon.
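The projection described above leaves vectors already inside the ball untouched and rescales the rest onto a sphere of radius `1 - epsilon`. A pure-Python sketch of that projection for a single vector (an assumption about the exact clipping rule, not the tfa code):

```python
import math

def poincare_project(v, epsilon=1e-5):
    """Scale v so it lies inside the Poincare ball of radius 1 - epsilon."""
    norm = math.sqrt(sum(x * x for x in v))
    max_norm = 1.0 - epsilon
    if norm <= max_norm:
        return list(v)  # already inside the ball; leave unchanged
    return [x * max_norm / norm for x in v]

print(poincare_project([3.0, 4.0]))  # norm 5 → rescaled to norm ≈ 1
```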

class PolynomialCrossing: Layer for Deep & Cross Network to learn explicit feature interactions.

class Snake: Snake layer to learn periodic functions with the trainable frequency scalar.
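The Snake activation is `x + sin²(a·x)/a`, where the frequency `a` is the trainable scalar mentioned above. A scalar sketch:

```python
import math

def snake(x, frequency=1.0):
    # Snake activation: x + (1/a) * sin^2(a * x), with trainable frequency a.
    a = frequency
    return x + math.sin(a * x) ** 2 / a

print(snake(0.0))  # → 0.0
```

The sin² term gives the function a periodic ripple around the identity, which is what lets the layer fit periodic signals.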

class Sparsemax: Sparsemax activation function.
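Sparsemax (Martins & Astudillo, 2016) is a softmax alternative that projects logits onto the probability simplex and can produce exact zeros. A pure-Python sketch of the closed-form projection (sort, find the support size k, then threshold):

```python
def sparsemax(logits):
    """Sparsemax: like softmax, but low-scoring entries get exactly zero."""
    z = sorted(logits, reverse=True)
    cumsum = 0.0
    k, k_sum = 0, 0.0
    for j, zj in enumerate(z, start=1):
        cumsum += zj
        # j is in the support while 1 + j*z_j exceeds the running sum.
        if 1.0 + j * zj > cumsum:
            k, k_sum = j, cumsum
    tau = (k_sum - 1.0) / k          # threshold computed from the support
    return [max(x - tau, 0.0) for x in logits]

print(sparsemax([1.0, 0.0]))  # → [1.0, 0.0]
```

Note the contrast with softmax, which would assign the second entry a small but nonzero probability.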

class SpatialPyramidPooling2D: Performs Spatial Pyramid Pooling.

class SpectralNormalization: Performs spectral normalization on weights.

class TLU: Thresholded Linear Unit.
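The Thresholded Linear Unit is `max(x, tau)`: a ReLU whose threshold `tau` is a learnable parameter rather than fixed at zero. A scalar sketch:

```python
def tlu(x, tau=0.0):
    # Thresholded Linear Unit: like ReLU, but the floor tau is learnable.
    return max(x, tau)

print(tlu(-1.0, tau=-0.5))  # → -0.5
```

With `tau = 0` this reduces to the ordinary ReLU.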

class WeightNormalization: Performs weight normalization.
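Weight normalization reparameterizes a weight vector as `w = g * v / ||v||`, decoupling its direction `v/||v||` from its magnitude `g` (Salimans & Kingma, 2016). A minimal sketch of that reparameterization for a single vector:

```python
import math

def weight_norm(v, g):
    """Reparameterize a weight vector as w = g * v / ||v||."""
    norm = math.sqrt(sum(x * x for x in v))
    return [g * x / norm for x in v]

w = weight_norm([3.0, 4.0], g=2.0)
print(w)  # → [1.2, 1.6]
```

In the wrapper layer, `g` and `v` are the trained variables and `w` is recomputed from them on each call.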