Adamax

public class Adamax

Optimizer that implements the Adamax algorithm.

It is a variant of Adam based on the infinity norm. Default parameters follow those provided in the paper. Adamax is sometimes superior to Adam, especially in models with embeddings.
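The per-parameter update can be sketched in plain Java (a hypothetical helper for illustration, not part of this class): Adamax keeps Adam's first-moment estimate but replaces the second moment with an exponentially weighted infinity norm, so only the first moment needs bias correction.

```java
// Minimal single-parameter Adamax update sketch (illustrative only).
public class AdamaxStep {
    double m = 0.0; // first-moment estimate ("m")
    double u = 0.0; // exponentially weighted infinity norm ("v")

    // Returns the updated parameter after one step t (1-based).
    double step(double theta, double grad, int t,
                double lr, double beta1, double beta2, double eps) {
        m = beta1 * m + (1 - beta1) * grad;      // decayed first moment
        u = Math.max(beta2 * u, Math.abs(grad)); // infinity-norm estimate
        // Bias-correct only the first moment; the infinity norm needs none.
        return theta - (lr / (1 - Math.pow(beta1, t))) * m / (u + eps);
    }

    public static void main(String[] args) {
        AdamaxStep s = new AdamaxStep();
        // One step with the documented default hyperparameters.
        double theta = s.step(1.0, 0.5, 1, 0.001, 0.9, 0.999, 1e-7);
        System.out.println(theta); // ≈ 0.999
    }
}
```

With the defaults, the first step moves the parameter by roughly the learning rate, since the bias-corrected first moment and the infinity norm both reduce to the gradient.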

Constants

float BETA_ONE_DEFAULT
float BETA_TWO_DEFAULT
float EPSILON_DEFAULT
String FIRST_MOMENT
float LEARNING_RATE_DEFAULT
String SECOND_MOMENT

Public Constructors

Adamax(Graph graph)
Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, String name)
Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, float learningRate)
Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, String name, float learningRate)
Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, float learningRate, float betaOne, float betaTwo, float epsilon)
Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, String name, float learningRate, float betaOne, float betaTwo, float epsilon)
Creates an Optimizer that implements the Adamax algorithm.

Public Methods

String
getOptimizerName ()
Gets the name of the optimizer.

Constants

public static final float BETA_ONE_DEFAULT

Constant Value: 0.9

public static final float BETA_TWO_DEFAULT

Constant Value: 0.999

public static final float EPSILON_DEFAULT

Constant Value: 1.0E-7

public static final String FIRST_MOMENT

Constant Value: "m"

public static final float LEARNING_RATE_DEFAULT

Constant Value: 0.001

public static final String SECOND_MOMENT

Constant Value: "v"

Public Constructors

public Adamax(Graph graph)

Creates an Optimizer that implements the Adamax algorithm.

Parameters
graph the TensorFlow graph

public Adamax(Graph graph, String name)

Creates an Optimizer that implements the Adamax algorithm.

Parameters
graph the TensorFlow graph
name name for the operations created when applying gradients. Defaults to "Adamax".

public Adamax(Graph graph, float learningRate)

Creates an Optimizer that implements the Adamax algorithm.

Parameters
graph the TensorFlow graph
learningRate The learning rate.

public Adamax(Graph graph, String name, float learningRate)

Creates an Optimizer that implements the Adamax algorithm.

Parameters
graph the TensorFlow graph
name name for the operations created when applying gradients. Defaults to "Adamax".
learningRate The learning rate.

public Adamax(Graph graph, float learningRate, float betaOne, float betaTwo, float epsilon)

Creates an Optimizer that implements the Adamax algorithm.

Parameters
graph the TensorFlow graph
learningRate The learning rate.
betaOne The exponential decay rate for the 1st moment estimates.
betaTwo The exponential decay rate for the exponentially weighted infinity norm.
epsilon A small constant for numerical stability.

public Adamax(Graph graph, String name, float learningRate, float betaOne, float betaTwo, float epsilon)

Creates an Optimizer that implements the Adamax algorithm.

Parameters
graph the TensorFlow graph
name name for the operations created when applying gradients. Defaults to "Adamax".
learningRate The learning rate.
betaOne The exponential decay rate for the 1st moment estimates.
betaTwo The exponential decay rate for the exponentially weighted infinity norm.
epsilon A small constant for numerical stability.
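To show how these hyperparameters interact, the following self-contained sketch minimizes f(x) = x² (gradient 2x) with the documented default values. This is a hypothetical plain-Java stand-in for illustration; the real class builds graph operations rather than updating values directly.

```java
// Illustrative Adamax loop with the documented defaults, not the graph-based API.
public class AdamaxDefaults {
    // Runs the given number of Adamax steps on f(x) = x^2, starting at x = 1.
    static double minimize(int steps) {
        double lr = 0.001, beta1 = 0.9, beta2 = 0.999, eps = 1e-7; // defaults
        double x = 1.0, m = 0.0, u = 0.0;
        for (int t = 1; t <= steps; t++) {
            double g = 2 * x;                        // gradient of x^2
            m = beta1 * m + (1 - beta1) * g;         // first-moment estimate
            u = Math.max(beta2 * u, Math.abs(g));    // infinity-norm estimate
            x -= (lr / (1 - Math.pow(beta1, t))) * m / (u + eps);
        }
        return x;
    }

    public static void main(String[] args) {
        System.out.println(minimize(2000)); // approaches the minimum at 0
    }
}
```

Because the step magnitude is bounded by roughly learningRate × |m| / u, the default learning rate of 0.001 moves the parameter slowly but steadily toward the minimum.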

Public Methods

public String getOptimizerName()

Gets the name of the optimizer.

Returns
  • The optimizer name.