
Module: tfa.layers


Additional layers that conform to the Keras API.


gelu module: Implements GeLU activation.

maxout module: Implements the Maxout layer.

normalizations module: Implements the GroupNormalization and InstanceNormalization layers.

optical_flow module: TensorFlow op performing the correlation cost operation.

poincare module: Implements the PoincareNormalize layer.

sparsemax module: Implements the Sparsemax activation.

wrappers module: Implements the WeightNormalization wrapper.


class CorrelationCost: Correlation Cost Layer.

class GeLU: Gaussian Error Linear Unit.

class GroupNormalization: Group normalization layer.

class InstanceNormalization: Instance normalization layer.
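Group normalization splits the channels of each sample into groups and normalizes within each group; instance normalization is the special case of one channel per group. A minimal NumPy sketch of the computation on a (batch, channels) tensor (illustrative only; names and shapes here are assumptions, not the tfa layer's signature):

```python
import numpy as np

def group_norm(x, groups, eps=1e-5):
    # x: (batch, channels); split channels into `groups` groups and
    # normalize each group to zero mean / unit variance per sample.
    b, c = x.shape
    g = x.reshape(b, groups, c // groups)
    mean = g.mean(axis=2, keepdims=True)
    var = g.var(axis=2, keepdims=True)
    return ((g - mean) / np.sqrt(var + eps)).reshape(b, c)

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 8))
y = group_norm(x, groups=2)
# Each group of 4 channels now has ~zero mean and ~unit variance per sample.
```

Because the statistics are computed per sample rather than per batch, group normalization behaves identically at any batch size, which is its main advantage over batch normalization.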

class Maxout: Applies Maxout to the input.
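Maxout reduces the feature dimension to `num_units` outputs by taking the maximum over groups of consecutive features. A minimal NumPy sketch of the reduction (an illustration of the operation, not the tfa layer's API):

```python
import numpy as np

def maxout(x, num_units):
    # x: (batch, features); features must be divisible by num_units.
    # Each output unit is the max over features // num_units inputs.
    b, f = x.shape
    assert f % num_units == 0
    return x.reshape(b, num_units, f // num_units).max(axis=2)

x = np.array([[1.0, -2.0, 3.0, 0.0]])
print(maxout(x, num_units=2))  # -> [[1. 3.]]
```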

class PoincareNormalize: Project into the Poincare ball with norm <= 1.0 - epsilon.
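The projection rescales any vector whose norm exceeds 1 - epsilon back onto that radius, and leaves shorter vectors unchanged. A minimal NumPy sketch under those assumptions (not the tfa layer's signature):

```python
import numpy as np

def poincare_normalize(x, epsilon=1e-5):
    # Rescale x so its norm is at most 1 - epsilon;
    # vectors already inside the ball are left unchanged.
    norm = np.linalg.norm(x)
    scale = min((1.0 - epsilon) / norm, 1.0) if norm > 0 else 1.0
    return x * scale

v = np.array([3.0, 4.0])                       # norm 5, outside the ball
print(np.linalg.norm(poincare_normalize(v)))   # ~0.99999
```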

class Sparsemax: Sparsemax activation function [1].
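Sparsemax (Martins & Astudillo, 2016) is the Euclidean projection of the logits onto the probability simplex, so unlike softmax it can assign exactly zero probability. A minimal NumPy sketch of the closed-form algorithm for a single vector (illustrative, not the tfa layer's API):

```python
import numpy as np

def sparsemax(z):
    # Project z onto the simplex: sort descending, find the support
    # size k(z), compute the threshold tau, then clip at zero.
    z_sorted = np.sort(z)[::-1]
    k = np.arange(1, len(z) + 1)
    cumsum = np.cumsum(z_sorted)
    support = 1 + k * z_sorted > cumsum
    k_z = k[support][-1]
    tau = (cumsum[support][-1] - 1) / k_z
    return np.maximum(z - tau, 0.0)

print(sparsemax(np.array([2.0, 1.0, 0.1])))  # -> [1. 0. 0.]
```

The output always sums to 1, and logits far below the largest one are truncated to exactly zero rather than receiving a small positive mass.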

class WeightNormalization: This wrapper reparameterizes a layer by decoupling the weight's magnitude and direction.
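The reparameterization (Salimans & Kingma, 2016) writes each weight vector as w = g · v / ‖v‖, so the scalar g carries the magnitude and v only the direction. A minimal NumPy sketch of that identity (an illustration of the idea, not the tfa wrapper's API):

```python
import numpy as np

# Weight normalization: w = g * v / ||v||, so the norm of w
# equals g regardless of the scale of v.
v = np.array([3.0, 4.0])   # direction parameter
g = 2.0                    # magnitude parameter
w = g * v / np.linalg.norm(v)
print(np.linalg.norm(w))   # -> 2.0
```

Optimizing g and v separately instead of w directly tends to improve the conditioning of the gradient and speed up convergence.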