org.tensorflow.op.train

Classes

AccumulatorApplyGradient Applies a gradient to a given accumulator.
AccumulatorNumAccumulated Returns the number of gradients aggregated in the given accumulator.
AccumulatorSetGlobalStep Updates the accumulator with a new value for global_step.
AccumulatorTakeGradient <T extends TType > Extracts the average gradient in the given ConditionalAccumulator.
ApplyAdadelta <T extends TType > Update '*var' according to the adadelta scheme.
ApplyAdadelta.Options Optional attributes for ApplyAdadelta
ApplyAdagrad <T extends TType > Update '*var' according to the adagrad scheme.
ApplyAdagrad.Options Optional attributes for ApplyAdagrad
ApplyAdagradDa <T extends TType > Update '*var' according to the proximal adagrad scheme.
ApplyAdagradDa.Options Optional attributes for ApplyAdagradDa
ApplyAdagradV2 <T extends TType > Update '*var' according to the adagrad scheme.
ApplyAdagradV2.Options Optional attributes for ApplyAdagradV2
ApplyAdam <T extends TType > Update '*var' according to the Adam algorithm. (See the usage sketch after this listing.)
ApplyAdam.Options Optional attributes for ApplyAdam
ApplyAdaMax <T extends TType > Update '*var' according to the AdaMax algorithm.
ApplyAdaMax.Options Optional attributes for ApplyAdaMax
ApplyAddSign <T extends TType > Update '*var' according to the AddSign update.
ApplyAddSign.Options Optional attributes for ApplyAddSign
ApplyCenteredRmsProp <T extends TType > Update '*var' according to the centered RMSProp algorithm.
ApplyCenteredRmsProp.Options Optional attributes for ApplyCenteredRmsProp
ApplyFtrl <T extends TType > Update '*var' according to the Ftrl-proximal scheme.
ApplyFtrl.Options Optional attributes for ApplyFtrl
ApplyGradientDescent <T extends TType > Update '*var' by subtracting 'alpha' * 'delta' from it. (See the usage sketch after this listing.)
ApplyGradientDescent.Options Optional attributes for ApplyGradientDescent
ApplyMomentum <T extends TType > Update '*var' according to the momentum scheme.
ApplyMomentum.Options Optional attributes for ApplyMomentum
ApplyPowerSign <T extends TType > Update '*var' according to the PowerSign update.
ApplyPowerSign.Options Optional attributes for ApplyPowerSign
ApplyProximalAdagrad <T extends TType > Update '*var' and '*accum' according to FOBOS with Adagrad learning rate.
ApplyProximalAdagrad.Options Optional attributes for ApplyProximalAdagrad
ApplyProximalGradientDescent <T extends TType > Update '*var' using the FOBOS algorithm with a fixed learning rate.
ApplyProximalGradientDescent.Options Optional attributes for ApplyProximalGradientDescent
ApplyRmsProp <T extends TType > Update '*var' according to the RMSProp algorithm.
ApplyRmsProp.Options Optional attributes for ApplyRmsProp
BatchMatMul <T extends TType > Multiplies slices of two tensors in batches.
BatchMatMul.Options Optional attributes for BatchMatMul
ComputeBatchSize Computes the static batch size of a dataset, excluding partial batches.
ConditionalAccumulator A conditional accumulator for aggregating gradients. (See the usage sketch after this listing.)
ConditionalAccumulator.Options Optional attributes for ConditionalAccumulator
GenerateVocabRemapping Given a path to new and old vocabulary files, returns a remapping Tensor of length `num_new_vocab`, where `remapping[i]` contains the row number in the old vocabulary that corresponds to row `i` in the new vocabulary (starting at line `new_vocab_offset` and up to `num_new_vocab` entities), or `-1` if entry `i` in the new vocabulary is not in the old vocabulary.
GenerateVocabRemapping.Options Optional attributes for GenerateVocabRemapping
MergeV2Checkpoints V2 format specific: merges the metadata files of sharded checkpoints.
MergeV2Checkpoints.Options Optional attributes for MergeV2Checkpoints
NegTrain Training via negative sampling.
PreventGradient <T extends TType > An identity op that triggers an error if a gradient is requested.
PreventGradient.Options Optional attributes for PreventGradient
ResourceAccumulatorApplyGradient Applies a gradient to a given accumulator.
ResourceAccumulatorNumAccumulated Returns the number of gradients aggregated in the given accumulator.
ResourceAccumulatorSetGlobalStep Updates the accumulator with a new value for global_step.
ResourceAccumulatorTakeGradient <T extends TType > Extracts the average gradient in the given ConditionalAccumulator.
ResourceApplyAdadelta Update '*var' according to the adadelta scheme.
ResourceApplyAdadelta.Options Optional attributes for ResourceApplyAdadelta
ResourceApplyAdagrad Update '*var' according to the adagrad scheme.
ResourceApplyAdagrad.Options Optional attributes for ResourceApplyAdagrad
ResourceApplyAdagradDa Update '*var' according to the proximal adagrad scheme.
ResourceApplyAdagradDa.Options Optional attributes for ResourceApplyAdagradDa
ResourceApplyAdam Update '*var' according to the Adam algorithm.
ResourceApplyAdam.Options Optional attributes for ResourceApplyAdam
ResourceApplyAdaMax Update '*var' according to the AdaMax algorithm.
ResourceApplyAdaMax.Options Optional attributes for ResourceApplyAdaMax
ResourceApplyAdamWithAmsgrad Update '*var' according to the Adam algorithm.
ResourceApplyAdamWithAmsgrad.Options Optional attributes for ResourceApplyAdamWithAmsgrad
ResourceApplyAddSign Update '*var' according to the AddSign update.
ResourceApplyAddSign.Options Optional attributes for ResourceApplyAddSign
ResourceApplyCenteredRmsProp Update '*var' according to the centered RMSProp algorithm.
ResourceApplyCenteredRmsProp.Options Optional attributes for ResourceApplyCenteredRmsProp
ResourceApplyFtrl Update '*var' according to the Ftrl-proximal scheme.
ResourceApplyFtrl.Options Optional attributes for ResourceApplyFtrl
ResourceApplyGradientDescent Update '*var' by subtracting 'alpha' * 'delta' from it.
ResourceApplyGradientDescent.Options Optional attributes for ResourceApplyGradientDescent
ResourceApplyKerasMomentum Update '*var' according to the momentum scheme.
ResourceApplyKerasMomentum.Options Optional attributes for ResourceApplyKerasMomentum
ResourceApplyMomentum Update '*var' according to the momentum scheme.
ResourceApplyMomentum.Options Optional attributes for ResourceApplyMomentum
ResourceApplyPowerSign Update '*var' according to the PowerSign update.
ResourceApplyPowerSign.Options Optional attributes for ResourceApplyPowerSign
ResourceApplyProximalAdagrad Update '*var' and '*accum' according to FOBOS with Adagrad learning rate.
ResourceApplyProximalAdagrad.Options Optional attributes for ResourceApplyProximalAdagrad
ResourceApplyProximalGradientDescent Update '*var' using the FOBOS algorithm with a fixed learning rate.
ResourceApplyProximalGradientDescent.Options Optional attributes for ResourceApplyProximalGradientDescent
ResourceApplyRmsProp Update '*var' according to the RMSProp algorithm.
ResourceApplyRmsProp.Options Optional attributes for ResourceApplyRmsProp
ResourceConditionalAccumulator A conditional accumulator for aggregating gradients.
ResourceConditionalAccumulator.Options Optional attributes for ResourceConditionalAccumulator
ResourceSparseApplyAdadelta Update relevant entries in '*var' and '*accum' according to the adadelta scheme.
ResourceSparseApplyAdadelta.Options Optional attributes for ResourceSparseApplyAdadelta
ResourceSparseApplyAdagrad Update relevant entries in '*var' and '*accum' according to the adagrad scheme.
ResourceSparseApplyAdagrad.Options Optional attributes for ResourceSparseApplyAdagrad
ResourceSparseApplyAdagradDa Update entries in '*var' and '*accum' according to the proximal adagrad scheme.
ResourceSparseApplyAdagradDa.Options Optional attributes for ResourceSparseApplyAdagradDa
ResourceSparseApplyAdagradV2 Update relevant entries in '*var' and '*accum' according to the adagrad scheme.
ResourceSparseApplyAdagradV2.Options Optional attributes for ResourceSparseApplyAdagradV2
ResourceSparseApplyCenteredRmsProp Update '*var' according to the centered RMSProp algorithm.
ResourceSparseApplyCenteredRmsProp.Options Optional attributes for ResourceSparseApplyCenteredRmsProp
ResourceSparseApplyFtrl Update relevant entries in '*var' according to the Ftrl-proximal scheme.
ResourceSparseApplyFtrl.Options Optional attributes for ResourceSparseApplyFtrl
ResourceSparseApplyKerasMomentum Update relevant entries in '*var' and '*accum' according to the momentum scheme.
ResourceSparseApplyKerasMomentum.Options Optional attributes for ResourceSparseApplyKerasMomentum
ResourceSparseApplyMomentum Update relevant entries in '*var' and '*accum' according to the momentum scheme.
ResourceSparseApplyMomentum.Options Optional attributes for ResourceSparseApplyMomentum
ResourceSparseApplyProximalAdagrad Sparse update of entries in '*var' and '*accum' according to the FOBOS algorithm.
ResourceSparseApplyProximalAdagrad.Options Optional attributes for ResourceSparseApplyProximalAdagrad
ResourceSparseApplyProximalGradientDescent Sparse update of '*var' using the FOBOS algorithm with a fixed learning rate.
ResourceSparseApplyProximalGradientDescent.Options Optional attributes for ResourceSparseApplyProximalGradientDescent
ResourceSparseApplyRmsProp Update '*var' according to the RMSProp algorithm.
ResourceSparseApplyRmsProp.Options Optional attributes for ResourceSparseApplyRmsProp
Restore Restores tensors from a V2 checkpoint.
RestoreSlice <T extends TType > Restores a tensor from checkpoint files.
RestoreSlice.Options Optional attributes for RestoreSlice
Save Saves tensors in V2 checkpoint format. (See the save/restore sketch after this listing.)
SaveSlices Saves slices of input tensors to disk.
SdcaFprint Computes fingerprints of the input strings.
SdcaOptimizer Distributed version of Stochastic Dual Coordinate Ascent (SDCA) optimizer for linear models with L1 + L2 regularization.
SdcaOptimizer.Options Optional attributes for SdcaOptimizer
SdcaShrinkL1 Applies the L1 regularization shrink step to the parameters.
SparseApplyAdadelta <T extends TType > Update relevant entries in '*var' and '*accum' according to the adadelta scheme.
SparseApplyAdadelta.Options Optional attributes for SparseApplyAdadelta
SparseApplyAdagrad <T extends TType > Update relevant entries in '*var' and '*accum' according to the adagrad scheme. (See the usage sketch after this listing.)
SparseApplyAdagrad.Options Optional attributes for SparseApplyAdagrad
SparseApplyAdagradDa <T extends TType > Update entries in '*var' and '*accum' according to the proximal adagrad scheme.
SparseApplyAdagradDa.Options Optional attributes for SparseApplyAdagradDa
SparseApplyCenteredRmsProp <T extends TType > Update '*var' according to the centered RMSProp algorithm.
SparseApplyCenteredRmsProp.Options Optional attributes for SparseApplyCenteredRmsProp
SparseApplyFtrl <T extends TType > Update relevant entries in '*var' according to the Ftrl-proximal scheme.
SparseApplyFtrl.Options Optional attributes for SparseApplyFtrl
SparseApplyMomentum <T extends TType > Update relevant entries in '*var' and '*accum' according to the momentum scheme.
SparseApplyMomentum.Options Optional attributes for SparseApplyMomentum
SparseApplyProximalAdagrad <T extends TType > Sparse update of entries in '*var' and '*accum' according to the FOBOS algorithm.
SparseApplyProximalAdagrad.Options Optional attributes for SparseApplyProximalAdagrad
SparseApplyProximalGradientDescent <T extends TType > Sparse update of '*var' using the FOBOS algorithm with a fixed learning rate.
SparseApplyProximalGradientDescent.Options Optional attributes for SparseApplyProximalGradientDescent
SparseApplyRmsProp <T extends TType > Update '*var' according to the RMSProp algorithm.
SparseApplyRmsProp.Options Optional attributes for SparseApplyRmsProp
TileGrad <T extends TType > Returns the gradient of `Tile`.
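
The sketches below illustrate the main usage patterns behind the classes above. They are illustrative only: the `Ops` factory names and argument orders are assumptions based on how these op wrappers are generated and may differ between releases.

Most dense `Apply*` ops share one shape: they take a ref-typed variable, one or more scalar hyperparameters, and a gradient, and update the variable in place. A minimal sketch for the simplest case, ApplyGradientDescent:

```java
import org.tensorflow.Graph;
import org.tensorflow.Session;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Assign;
import org.tensorflow.op.core.Constant;
import org.tensorflow.op.core.Variable;
import org.tensorflow.op.train.ApplyGradientDescent;
import org.tensorflow.types.TFloat32;

public class ApplyGradientDescentSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // Ref-typed variable holding the parameters to train (factory signature assumed).
      Variable<TFloat32> weights = tf.variable(Shape.of(2), TFloat32.class);
      Assign<TFloat32> initWeights = tf.assign(weights, tf.constant(new float[] {1f, 2f}));

      // Stand-in gradient and learning rate; a real model would compute the
      // gradient with tf.gradients(...) instead of hard-coding it.
      Constant<TFloat32> grad = tf.constant(new float[] {0.5f, -0.5f});
      Constant<TFloat32> alpha = tf.constant(0.1f);

      // ApplyGradientDescent performs: var -= alpha * delta.
      ApplyGradientDescent<TFloat32> step = tf.train.applyGradientDescent(weights, alpha, grad);

      try (Session session = new Session(graph)) {
        session.runner().addTarget(initWeights).run(); // initialize the variable once
        session.runner().addTarget(step).run();        // run one descent step
      }
    }
  }
}
```

The other dense `Apply*` ops (Adadelta, Adagrad, RMSProp, Ftrl, Momentum, ...) follow the same pattern, adding their accumulator or moment slot variables and extra hyperparameters as inputs.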
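
Stateful optimizers such as Adam keep additional slot variables (the first- and second-moment estimates) plus the decayed beta powers, and all of them are passed to the op explicitly. A sketch assuming the argument order matches the op's declared inputs (var, m, v, beta1_power, beta2_power, lr, beta1, beta2, epsilon, grad):

```java
import org.tensorflow.Operand;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Variable;
import org.tensorflow.op.train.ApplyAdam;
import org.tensorflow.types.TFloat32;

public class ApplyAdamSketch {
  /** Builds one Adam update step for a single 2-element parameter vector. */
  static ApplyAdam<TFloat32> buildAdamStep(Ops tf, Operand<TFloat32> grad) {
    Variable<TFloat32> var = tf.variable(Shape.of(2), TFloat32.class); // parameters
    Variable<TFloat32> m = tf.variable(Shape.of(2), TFloat32.class);   // 1st-moment slot
    Variable<TFloat32> v = tf.variable(Shape.of(2), TFloat32.class);   // 2nd-moment slot

    // Hyperparameters and decayed beta powers (constants here; in practice the
    // beta powers are themselves decayed every step).
    return tf.train.applyAdam(var, m, v,
        tf.constant(0.9f),    // beta1_power
        tf.constant(0.999f),  // beta2_power
        tf.constant(1e-3f),   // learning rate
        tf.constant(0.9f),    // beta1
        tf.constant(0.999f),  // beta2
        tf.constant(1e-8f),   // epsilon
        grad);
  }
}
```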
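
The `SparseApply*` and `ResourceSparseApply*` variants touch only the rows of '*var' selected by an integer `indices` input, which is how embedding-style parameters are trained. A sketch using SparseApplyAdagrad, with the argument order assumed to be (var, accum, lr, grad, indices):

```java
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Constant;
import org.tensorflow.op.core.Variable;
import org.tensorflow.op.train.SparseApplyAdagrad;
import org.tensorflow.types.TFloat32;
import org.tensorflow.types.TInt32;

public class SparseApplyAdagradSketch {
  /** Builds one sparse Adagrad update touching only rows 0 and 2 of a 4x3 variable. */
  static SparseApplyAdagrad<TFloat32> buildSparseStep(Ops tf) {
    // Both variables would be initialized with tf.assign(...) before running the step.
    Variable<TFloat32> weights = tf.variable(Shape.of(4, 3), TFloat32.class);
    Variable<TFloat32> accum = tf.variable(Shape.of(4, 3), TFloat32.class);

    Constant<TFloat32> lr = tf.constant(0.01f);
    Constant<TInt32> indices = tf.constant(new int[] {0, 2});     // rows to update
    Constant<TFloat32> grad = tf.constant(new float[][] {         // one gradient row per index
        {0.1f, 0.2f, 0.3f},
        {0.4f, 0.5f, 0.6f}});

    // For each selected row: accum += grad^2; var -= lr * grad / sqrt(accum).
    return tf.train.sparseApplyAdagrad(weights, accum, lr, grad, indices);
  }
}
```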
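
ConditionalAccumulator and the Accumulator* ops implement gradient aggregation for synchronous training: workers push local gradients with AccumulatorApplyGradient, and the trainer takes the average once enough gradients have arrived with AccumulatorTakeGradient. A sketch of the graph-side wiring (factory names and argument order assumed):

```java
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.train.AccumulatorApplyGradient;
import org.tensorflow.op.train.AccumulatorTakeGradient;
import org.tensorflow.op.train.ConditionalAccumulator;
import org.tensorflow.types.TFloat32;

public class AccumulatorSketch {
  static AccumulatorTakeGradient<TFloat32> buildAggregation(Ops tf) {
    // Accumulator aggregating float32 gradients of shape [2].
    ConditionalAccumulator acc = tf.train.conditionalAccumulator(TFloat32.class, Shape.of(2));

    // A worker pushes one local gradient, stamped with its local step
    // (the worker's session would run `push` as a target).
    AccumulatorApplyGradient push = tf.train.accumulatorApplyGradient(
        acc, tf.constant(1L), tf.constant(new float[] {0.5f, -0.5f}));

    // The trainer takes the average once at least 2 gradients have been applied.
    return tf.train.accumulatorTakeGradient(acc, tf.constant(2), TFloat32.class);
  }
}
```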
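
Save and Restore are the raw ops behind V2 checkpointing: Save writes named tensors under a path prefix, and Restore reads them back by name and declared dtype. A sketch with hypothetical names ("/tmp/ckpt/model", "weights"); the String-vector constant overload and the exact save/restore signatures are assumptions:

```java
import java.util.Arrays;
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Constant;
import org.tensorflow.op.train.Restore;
import org.tensorflow.op.train.Save;
import org.tensorflow.types.TFloat32;
import org.tensorflow.types.TString;
import org.tensorflow.types.family.TType;

public class CheckpointSketch {
  /** Adds a Save and a matching Restore of a single named tensor to the graph. */
  static void buildSaveAndRestore(Ops tf) {
    Constant<TString> prefix = tf.constant("/tmp/ckpt/model");       // checkpoint path prefix
    Constant<TString> names = tf.constant(new String[] {"weights"}); // assumed String[] overload
    Constant<TString> slices = tf.constant(new String[] {""});       // "" means the whole tensor

    // Save writes the listed tensors under the given names; it would be run as a session target.
    Save save = tf.train.save(prefix, names, slices,
        Arrays.<Operand<?>>asList(tf.constant(new float[] {1f, 2f})));

    // Restore reads them back by name; the expected dtypes must be declared.
    Restore restore = tf.train.restore(prefix, names, slices,
        Arrays.<Class<? extends TType>>asList(TFloat32.class));
  }
}
```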