tf.contrib.learn.evaluate(graph, output_dir, checkpoint_path, eval_dict, update_op=None, global_step_tensor=None, supervisor_master='', log_every_steps=10, feed_fn=None, max_steps=None)
Evaluate a model loaded from a checkpoint. (deprecated)
Given graph, a directory to write summaries to (output_dir), a checkpoint to restore variables from, and a dict of Tensors to evaluate, run an eval loop for max_steps steps, or until an exception (generally, an end-of-input signal from a reader operation) is raised from running the evaluation.

In each step of evaluation, all tensors in the eval_dict are evaluated, and every log_every_steps steps, they are logged. At the very end of evaluation, a summary is evaluated (the summary ops being found in the graph) and written to output_dir.
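The loop behaviour described above can be sketched in plain Python. This is an illustrative stand-in, not the TensorFlow implementation: EndOfInput, next_batch, and metrics_fn are hypothetical names standing in for a reader's end-of-input exception, the input pipeline, and the evaluation of eval_dict.

```python
class EndOfInput(Exception):
    """Stand-in for the end-of-input exception a reader op would raise."""
    pass

def eval_loop(next_batch, metrics_fn, log_every_steps=10, max_steps=None):
    """Run metrics_fn on batches until max_steps or end of input.

    Returns (results, steps_run), where results are the metrics from the
    last completed step, or None if no steps ran -- mirroring the
    documented "None if no eval steps were run" behaviour.
    """
    results = None
    step = 0
    while max_steps is None or step < max_steps:
        try:
            batch = next_batch()  # may raise EndOfInput when exhausted
        except EndOfInput:
            break
        results = metrics_fn(batch)  # "all tensors in eval_dict are evaluated"
        step += 1
        if step % log_every_steps == 0:  # log every log_every_steps steps
            print("step %d: %s" % (step, results))
    return results, step
```

Note that when max_steps is None the loop terminates only via the end-of-input exception, which is why the eval_dict documentation below requires the inputs to eventually be exhausted in that case.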
Args:

graph: A Graph to train. It is expected that this graph is not in use elsewhere.
output_dir: A string containing the directory to write a summary to.
checkpoint_path: A string containing the path to a checkpoint to restore. Can be None if the graph doesn't require loading any variables.
eval_dict: A dict mapping string names to tensors to evaluate. It is evaluated in every logging step. The result of the final evaluation is returned. If update_op is None, then it's evaluated in every step. If max_steps is None, this should depend on a reader that will raise an end-of-input exception when the inputs are exhausted.
update_op: A Tensor which is run in every step.
global_step_tensor: A Variable containing the global step. If None, one is extracted from the graph using the same logic as in Supervisor. Used to place eval summaries on training curves.
supervisor_master: The master string to use when preparing the session.
log_every_steps: Integer. Output logs every log_every_steps evaluation steps. The logs contain the eval_dict and timing information.
feed_fn: A function that is called every iteration to produce a feed_dict passed to session.run calls. Optional.
max_steps: Integer. Evaluate eval_dict this many times.
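The interplay between eval_dict and update_op documented above (eval_dict runs every step when update_op is None, otherwise only on logging steps) can be sketched as follows. This is a hypothetical plain-Python model of one step, with callables standing in for tensor evaluations:

```python
def run_step(step, eval_dict_fn, update_op=None, log_every_steps=10):
    """Model of one evaluation step.

    If update_op is given, it is run every step and eval_dict is only
    evaluated on logging steps; if update_op is None, eval_dict is
    evaluated on every step. Returns the eval_dict result or None.
    """
    if update_op is not None:
        update_op()  # the Tensor run in every step
        if step % log_every_steps == 0:
            return eval_dict_fn()  # evaluated only in logging steps
        return None
    return eval_dict_fn()  # update_op is None: evaluated every step
```

This matches the documented contract: with an update_op (e.g. a streaming-metric update), the potentially expensive eval_dict read happens only at logging cadence.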
Returns:

A tuple (eval_results, global_step):

eval_results: A dict mapping string to numeric values (int, float) that are the result of running eval_dict in the last step. None if no eval steps were run.
global_step: The global step this evaluation corresponds to.
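The returned two-tuple can be unpacked directly; the values below are hypothetical, illustrating only the shape of the result:

```python
# Hypothetical return value from evaluate() -- illustrative numbers only.
eval_results, global_step = ({"accuracy": 0.91, "loss": 0.34}, 12000)

if eval_results is None:
    # No eval steps were run (e.g. the inputs were already exhausted).
    print("no evaluation steps ran")
else:
    for name, value in sorted(eval_results.items()):
        print("step %d %s: %.2f" % (global_step, name, value))
```

Checking for None before reading eval_results matters because, as documented above, zero eval steps yields None rather than an empty dict.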