
tf_agents.utils.numpy_storage.NumpyStorage


A class to store nested objects in a collection of numpy arrays.

tf_agents.utils.numpy_storage.NumpyStorage(
    data_spec, capacity
)

If a data_spec of {'foo': ArraySpec(shape=(4,), dtype=np.uint8), 'bar': ArraySpec(shape=(3, 7), dtype=np.float32)} were used, this would create two arrays, one for the 'foo' key and one for the 'bar' key. The .set method takes a Python dictionary matching the spec and breaks it down into its component arrays before storing them; the .get method reassembles a dictionary from the stored components and returns it.

Args:

  • data_spec: An ArraySpec or a list/tuple/nest of ArraySpecs describing a single item that can be stored in this table.
  • capacity: The maximum number of items that can be stored in the buffer.
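A minimal numpy-only sketch of the preallocation the constructor performs, assuming a plain dict of (shape, dtype) pairs in place of real ArraySpecs. The class name SketchNumpyStorage and its internals are hypothetical illustrations, not the tf_agents implementation:

```python
import numpy as np

class SketchNumpyStorage:
    """Hypothetical sketch: one preallocated array per leaf of the spec."""

    def __init__(self, data_spec, capacity):
        if not isinstance(data_spec, dict):
            # The real class raises ValueError when data_spec is not
            # an ArraySpec or a nest of ArraySpecs.
            raise ValueError('data_spec must be a nest of specs')
        # Each stored array has shape (capacity,) + the spec's shape,
        # so `capacity` items can be written without reallocating.
        self._arrays = {
            key: np.zeros((capacity,) + tuple(shape), dtype=dtype)
            for key, (shape, dtype) in data_spec.items()
        }

# Mirrors the example spec above, with (shape, dtype) tuples
# standing in for ArraySpec instances.
spec = {'foo': ((4,), np.uint8), 'bar': ((3, 7), np.float32)}
storage = SketchNumpyStorage(spec, capacity=10)
```

With this spec and capacity=10, the sketch preallocates a (10, 4) uint8 array for 'foo' and a (10, 3, 7) float32 array for 'bar'.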

Attributes:

  • name: Returns the name of this module as passed or determined in the constructor.

    NOTE: This is not the same as the self.name_scope.name which includes parent module names.

  • name_scope: Returns a tf.name_scope instance for this class.

  • submodules: Sequence of all sub-modules.

    Submodules are modules which are properties of this module, or found as properties of modules which are properties of this module (and so on).

a = tf.Module()
b = tf.Module()
c = tf.Module()
a.b = b
b.c = c
assert list(a.submodules) == [b, c]
assert list(b.submodules) == [c]
assert list(c.submodules) == []
  • trainable_variables: Sequence of trainable variables owned by this module and its submodules.

  • variables: Sequence of variables owned by this module and its submodules.

Raises:

  • ValueError: If data_spec is not an instance or nest of ArraySpecs.

Methods

get


get(
    idx
)

Get the value stored at idx.

set


set(
    table_idx, value
)

Set the entry at table_idx to value.
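The two methods mirror each other: set splits a dictionary into per-key components and writes each at the given index, while get reads each component back and rebuilds the dictionary. A self-contained sketch of that round trip in plain numpy (the function names set_item and get_item are hypothetical, not the library API):

```python
import numpy as np

# Preallocated component arrays for a capacity of 5, matching the
# layout described earlier: one array per leaf of the data_spec.
capacity = 5
arrays = {
    'foo': np.zeros((capacity, 4), dtype=np.uint8),
    'bar': np.zeros((capacity, 3, 7), dtype=np.float32),
}

def set_item(table_idx, value):
    # set: break the dict into components, write each at table_idx.
    for key, component in value.items():
        arrays[key][table_idx] = component

def get_item(idx):
    # get: read each component at idx and rebuild the dict.
    return {key: arr[idx] for key, arr in arrays.items()}

item = {'foo': np.arange(4, dtype=np.uint8),
        'bar': np.ones((3, 7), dtype=np.float32)}
set_item(2, item)
restored = get_item(2)
```

The restored dictionary has the same keys, shapes, and values that were stored, while the backing memory stays in the preallocated arrays.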

with_name_scope

@classmethod
with_name_scope(
    cls, method
)

Decorator to automatically enter the module name scope.

class MyModule(tf.Module):
  @tf.Module.with_name_scope
  def __call__(self, x):
    if not hasattr(self, 'w'):
      self.w = tf.Variable(tf.random.normal([x.shape[1], 64]))
    return tf.matmul(x, self.w)

Using the above module would produce tf.Variables and tf.Tensors whose names included the module name:

mod = MyModule()
mod(tf.ones([8, 32]))
# ==> <tf.Tensor: ...>
mod.w
# ==> <tf.Variable ...'my_module/w:0'>

Args:

  • method: The method to wrap.

Returns:

The original method wrapped such that it enters the module's name scope.
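The wrapping pattern can be sketched without TensorFlow. Here FakeModule is a hypothetical stand-in for tf.Module whose name_scope is written as a context-manager method (in the real tf.Module it is a property returning a tf.name_scope); the decorator simply runs the wrapped method inside that scope:

```python
import contextlib
import functools

class FakeModule:
    """Hypothetical stand-in for tf.Module, just to show the pattern."""

    def __init__(self, name):
        self.name = name
        self.scopes_entered = []

    @contextlib.contextmanager
    def name_scope(self):
        # Stand-in for the real name scope: record that it was entered.
        self.scopes_entered.append(self.name)
        yield

def with_name_scope(method):
    """Conceptual sketch of the decorator: execute the wrapped method
    inside the module's name scope, so anything it creates is prefixed."""
    @functools.wraps(method)
    def wrapper(self, *args, **kwargs):
        with self.name_scope():
            return method(self, *args, **kwargs)
    return wrapper

class MyModule(FakeModule):
    @with_name_scope
    def __call__(self, x):
        return x * 2

mod = MyModule('my_module')
result = mod(21)  # executes inside mod's name scope
```

In the real class, entering the scope is what gives variables created inside __call__ names like 'my_module/w:0', as in the example above.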