


Creates a feed-forward network as a tf.keras.Sequential model.

The network stacks fully connected layers with optional batch normalization and dropout after each hidden layer, and can optionally apply batch normalization to the inputs as well.

Example usage:

tower = create_tower(hidden_layer_dims=[64, 32, 16], output_units=1)
inputs = tf.ones([2, 3, 1])
tower_logits = tower(inputs)

Args:
hidden_layer_dims: Iterable of numbers of hidden units per layer. All layers are fully connected. For example, [64, 32] means the first layer has 64 nodes and the second has 32.
output_units: Size of the output logits from this tower.
activation: Activation function applied to each layer. If None, an identity activation is used.
input_batch_norm: Whether to use batch normalization on the input layer.
use_batch_norm: Whether to use batch normalization after each hidden layer.
batch_norm_moment: Momentum for the moving average in batch normalization.
dropout: When not None, the probability that a given coordinate is dropped out.
name: Name of the Keras layer.
**kwargs: Keyword arguments passed to every tf.keras.layers.Dense layer.

Returns: A tf.keras.Sequential object.
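For illustration, the tower these arguments describe can be sketched in plain Keras. This is a minimal sketch of one plausible layer ordering (Dense, then batch normalization, then activation, then dropout), not the library's exact implementation; the function name create_tower_sketch and its default values are assumptions.

```python
import tensorflow as tf


def create_tower_sketch(hidden_layer_dims,
                        output_units,
                        activation=None,
                        input_batch_norm=False,
                        use_batch_norm=True,
                        batch_norm_moment=0.999,
                        dropout=0.5,
                        name=None,
                        **kwargs):
  """Sketch of a feed-forward tower as described above (assumed ordering)."""
  model = tf.keras.Sequential(name=name)
  # Optionally normalize the raw inputs before any hidden layer.
  if input_batch_norm:
    model.add(tf.keras.layers.BatchNormalization(momentum=batch_norm_moment))
  for dim in hidden_layer_dims:
    # Fully connected hidden layer; **kwargs is forwarded to each Dense layer.
    model.add(tf.keras.layers.Dense(dim, **kwargs))
    if use_batch_norm:
      model.add(tf.keras.layers.BatchNormalization(momentum=batch_norm_moment))
    # Activation(None) resolves to the identity (linear) activation.
    model.add(tf.keras.layers.Activation(activation))
    if dropout:
      model.add(tf.keras.layers.Dropout(dropout))
  # Final projection to the requested number of output logits.
  model.add(tf.keras.layers.Dense(output_units, **kwargs))
  return model
```

Calling the sketch as in the example usage above, create_tower_sketch(hidden_layer_dims=[64, 32, 16], output_units=1) applied to an input of shape [2, 3, 1] yields logits of shape [2, 3, 1], since Dense layers act on the last axis.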