tf.contrib.learn.read_batch_features(file_pattern, batch_size, features, reader, randomize_input=True, num_epochs=None, queue_capacity=10000, feature_queue_capacity=100, reader_num_threads=1, parse_fn=None, name=None)
See the guide: Learn (contrib) > Input processing
Adds operations to read, queue, batch and parse `Example` protos.

Given a file pattern (or list of files), sets up a queue of file names, reads `Example` protos using the provided `reader`, uses a batch queue to create batches of examples of size `batch_size`, and parses each example given the `features` specification.

All queue runners are added to the queue runners collection, and may be started via `start_queue_runners`.

All ops are added to the default graph.
Args:

file_pattern: List of files or pattern of file paths containing `Example` records. See `tf.gfile.Glob` for pattern rules.
batch_size: An int or scalar `Tensor` specifying the batch size to use.
features: A `dict` mapping feature keys to `FixedLenFeature` or `VarLenFeature` values.
reader: A function or class that returns an object with a `read` method, (filename tensor) -> (example tensor).
randomize_input: Whether the input should be randomized.
num_epochs: Integer specifying the number of times to read through the dataset. If None, cycles through the dataset forever. NOTE - If specified, creates a variable that must be initialized, so call tf.local_variables_initializer() as shown in the tests.
queue_capacity: Capacity for input queue.
feature_queue_capacity: Capacity of the parsed features queue. Set this value to a small number (for example, 5) if the parsed features are large.
reader_num_threads: The number of threads to read examples.
parse_fn: Parsing function, takes an `Example` Tensor and returns the parsed representation. If `None`, no parsing is done.
name: Name of resulting op.
Returns:

A dict of `Tensor` or `SparseTensor` objects for each in `features`.
Raises:

ValueError: for invalid inputs.
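A minimal usage sketch, assuming TF 1.x (`tf.contrib` was removed in TF 2.x); the file pattern, feature keys, and epoch count below are hypothetical placeholders:

```python
import tensorflow as tf

# Hypothetical feature spec: a fixed-length int feature and a variable-length
# string feature, matching how the serialized Examples were written.
batch = tf.contrib.learn.read_batch_features(
    file_pattern="/data/train-*.tfrecord",  # hypothetical path
    batch_size=128,
    features={
        "age": tf.FixedLenFeature([1], tf.int64),
        "tags": tf.VarLenFeature(tf.string),
    },
    reader=tf.TFRecordReader,
    num_epochs=5)

with tf.Session() as sess:
    # num_epochs creates a local variable that must be initialized.
    sess.run(tf.local_variables_initializer())
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    try:
        while not coord.should_stop():
            # batch is a dict: "age" yields a Tensor, "tags" a SparseTensor.
            values = sess.run(batch)
    except tf.errors.OutOfRangeError:
        pass  # reached the end of num_epochs
    finally:
        coord.request_stop()
        coord.join(threads)
```

Because `num_epochs` is set, reading stops after five passes and the loop exits via `OutOfRangeError`; with `num_epochs=None` the queue would cycle forever.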