Neural Architecture Search (NAS) recurrent network cell.
Inherits From: LayerRNNCell
```python
tf.contrib.rnn.NASCell(
    num_units, num_proj=None, use_bias=False, reuse=None, **kwargs
)
```
This implements the recurrent cell from the paper:
https://arxiv.org/abs/1611.01578
Barret Zoph and Quoc V. Le. "Neural Architecture Search with Reinforcement Learning" Proc. ICLR 2017.
The class uses an optional projection layer.
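For example, here is a minimal TF 1.x usage sketch that builds a NASCell with a projection layer and runs it with `tf.nn.dynamic_rnn`; the input shapes and unit counts below are illustrative assumptions, not part of the API:

```python
import tensorflow as tf  # TF 1.x, where tf.contrib is available

# Illustrative shapes (assumptions for this sketch).
batch_size, max_time, input_depth = 32, 20, 64

inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_depth])

# NAS cell with 128 units and a 64-dimensional projection layer.
cell = tf.contrib.rnn.NASCell(num_units=128, num_proj=64, use_bias=True)

# With num_proj set, the cell's output_size is the projection
# dimensionality, so outputs has shape [batch_size, max_time, 64].
outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
```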
| Args | |
|---|---|
| `num_units` | int, the number of units in the NAS cell. |
| `num_proj` | (optional) int, the output dimensionality for the projection matrices. If None, no projection is performed. |
| `use_bias` | (optional) bool, if True then use biases within the cell. This is False by default. |
| `reuse` | (optional) Python boolean describing whether to reuse variables in an existing scope. If not True, and the existing scope already has the given variables, an error is raised. |
| `**kwargs` | Additional keyword arguments. |
| Attributes | |
|---|---|
| `graph` | DEPRECATED FUNCTION |
| `output_size` | Integer or TensorShape: size of outputs produced by this cell. |
| `scope_name` | |
| `state_size` | Size(s) of state(s) used by this cell. It can be represented by an Integer, a TensorShape, or a tuple of Integers or TensorShapes. |
Methods
get_initial_state
```python
get_initial_state(
    inputs=None, batch_size=None, dtype=None
)
```
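In the base `RNNCell` implementation this returns a zero-filled initial state, inferring `batch_size` and `dtype` from `inputs` when they are not passed explicitly. A minimal sketch, where the cell configuration is an assumption for illustration:

```python
cell = tf.contrib.rnn.NASCell(num_units=128)

# Either pass the input batch, or give batch_size and dtype explicitly.
init_state = cell.get_initial_state(batch_size=32, dtype=tf.float32)
```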
zero_state
```python
zero_state(
    batch_size, dtype
)
```
Return zero-filled state tensor(s).
| Args | |
|---|---|
| `batch_size` | int, float, or unit Tensor representing the batch size. |
| `dtype` | the data type to use for the state. |
| Returns | |
|---|---|
| If `state_size` is an int or TensorShape, then the return value is an `N-D` tensor of shape `[batch_size, state_size]` filled with zeros. If `state_size` is a nested list or tuple, then the return value is a nested list or tuple (of the same structure) of `2-D` tensors with the shapes `[batch_size, s]` for each `s` in `state_size`. |
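For instance, assuming NASCell keeps an LSTM-style `(c, h)` state tuple (an assumption about the cell's `state_size`, not stated above), `zero_state` returns a pair of zero-filled tensors:

```python
cell = tf.contrib.rnn.NASCell(num_units=128)

# Assuming state_size is an LSTM-style (c, h) tuple, `state` is a pair
# of zero-filled tensors, each of shape [32, 128] here.
state = cell.zero_state(batch_size=32, dtype=tf.float32)
```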