TensorFlow provides a placeholder operation that must be fed with data on execution. For more info, see the section on Feeding data.
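A minimal sketch of defining and feeding a placeholder. This doc describes the TF 1.x graph API; the example accesses it through tf.compat.v1 so it also runs under TF 2.x (an assumption about the reader's environment, not part of the original doc):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A placeholder holds no value at graph-construction time; the batch
# dimension is left unspecified here.
x = tf.placeholder(tf.float32, shape=[None, 3])
y = tf.reduce_sum(x, axis=1)

with tf.Session() as sess:
    # The placeholder must be fed on execution via feed_dict.
    result = sess.run(y, feed_dict={x: np.ones((2, 3), dtype=np.float32)})
```

Running the graph without feeding `x` would raise an InvalidArgumentError, which is the defining property of a placeholder.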
For feeding SparseTensors, which are a composite type,
there is a convenience function:
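As a sketch of that convenience function, tf.sparse_placeholder accepts a SparseTensorValue in feed_dict (again written against tf.compat.v1 for runnability):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# sparse_placeholder stands in for a SparseTensor to be fed later.
sp = tf.sparse_placeholder(tf.float32)
dense = tf.sparse_tensor_to_dense(sp)

with tf.Session() as sess:
    # Feed the three components (indices, values, dense_shape) together.
    value = tf.SparseTensorValue(indices=[[0, 0], [1, 2]],
                                 values=[1.0, 2.0],
                                 dense_shape=[2, 3])
    out = sess.run(dense, feed_dict={sp: value})
```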
TensorFlow provides a set of Reader classes for reading data formats. For more information on inputs and readers, see Reading data.
TensorFlow provides several operations that you can use to convert various data formats into tensors.
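One such conversion op is tf.decode_csv, which splits a CSV-formatted string into typed column tensors; a minimal sketch (tf.compat.v1 used so the 1.x-style call runs under TF 2):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

record = tf.constant("3.5,hello,7")
# record_defaults fixes both the per-column dtypes (float, string, int)
# and the fallback values for missing fields.
cols = tf.decode_csv(record, record_defaults=[[0.0], [""], [0]])

with tf.Session() as sess:
    f, s, i = sess.run(cols)
```

Strings come back as Python bytes (`b"hello"`), a common surprise when post-processing decoded records.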
Example protocol buffer
TensorFlow provides several implementations of 'Queues', which are structures within the TensorFlow computation graph to stage pipelines of tensors together. The following describes the basic Queue interface and some implementations. To see an example use, see Threading and Queues.
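A minimal sketch of the Queue interface using tf.FIFOQueue, the simplest implementation (tf.compat.v1 assumed for runnability):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A FIFO queue holding up to 3 scalar int32 elements.
q = tf.FIFOQueue(capacity=3, dtypes=[tf.int32])
enqueue = q.enqueue_many([[10, 20, 30]])  # one list per queue component
dequeue = q.dequeue()

with tf.Session() as sess:
    sess.run(enqueue)
    first = sess.run(dequeue)   # elements come out in insertion order
    second = sess.run(dequeue)
```

Note that enqueue and dequeue are graph operations, not Python method calls that move data immediately; nothing happens until the session runs them.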
Dealing with the filesystem
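A sketch of the filesystem ops tf.read_file and tf.matching_files, run against a temporary file created for the example:

```python
import os
import tempfile
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "hello.txt")
with open(path, "w") as f:
    f.write("hello")

# read_file returns the file contents as a string tensor;
# matching_files expands a glob pattern to a list of filenames.
contents = tf.read_file(tf.constant(path))
matches = tf.matching_files(tf.constant(os.path.join(tmpdir, "*.txt")))

with tf.Session() as sess:
    data, files = sess.run([contents, matches])
```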
TensorFlow functions for setting up an input-prefetching pipeline. Please see the reading data how-to for context.
Beginning of an input pipeline
The "producer" functions add a queue to the graph and a corresponding
QueueRunner for running the subgraph that fills that queue.
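As a sketch, tf.train.string_input_producer is one such producer: it adds a filename queue plus the QueueRunner that keeps it filled, which is why the Coordinator/start_queue_runners boilerplate below is needed (tf.compat.v1 assumed):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# With shuffle=False the filenames are cycled in order, so the
# dequeue results below are deterministic.
filenames = ["a.csv", "b.csv"]
queue = tf.train.string_input_producer(filenames, shuffle=False)
next_file = queue.dequeue()

with tf.Session() as sess:
    coord = tf.train.Coordinator()
    # Start the QueueRunner threads that the producer registered.
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    first = sess.run(next_file)
    second = sess.run(next_file)
    coord.request_stop()
    coord.join(threads)
```

In a full pipeline a Reader would dequeue these filenames; here we dequeue directly just to show what the producer enqueues.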
Batching at the end of an input pipeline
These functions add a queue to the graph to assemble a batch of
examples, with possible shuffling. They also add a QueueRunner for
running the subgraph that fills that queue.
Use tf.train.batch or tf.train.batch_join for batching
examples that have already been well shuffled. Use
tf.train.shuffle_batch or tf.train.shuffle_batch_join for examples that would
benefit from additional shuffling.

Use tf.train.batch or tf.train.shuffle_batch if you want a
single thread producing examples to batch, or if you have a
single subgraph producing examples but you want to run it in N threads
(where you increase N until it can keep the queue full). Use
tf.train.batch_join or tf.train.shuffle_batch_join
if you have N different subgraphs producing examples to batch and you
want them run by N threads. Use the
maybe_* variants to enqueue conditionally.
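The end-to-end shape of such a pipeline can be sketched with tf.train.batch fed by a producer; range_input_producer is used here as a stand-in for a real example-producing subgraph (tf.compat.v1 assumed):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# range_input_producer yields indices 0..4; with shuffle=False the
# order is deterministic. tf.train.batch assembles them into batches
# of 2 and registers the QueueRunner that fills the batching queue.
index = tf.train.range_input_producer(limit=5, shuffle=False).dequeue()
batch = tf.train.batch([index], batch_size=2)

with tf.Session() as sess:
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    first_batch = sess.run(batch)
    coord.request_stop()
    coord.join(threads)
```

Swapping tf.train.batch for tf.train.shuffle_batch (with min_after_dequeue and capacity arguments) is the one-line change that adds shuffling to this pipeline.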