Composite FeatureConnector; each feature in dict has its own connector.

Inherits From: FeatureConnector

```python
tfds.features.FeaturesDict(
    feature_dict
)
```
The encode/decode methods of the spec feature recursively encode/decode every sub-connector given to the constructor. Other features can inherit from this class and call super() in order to get nested containers.
Example:

For DatasetInfo:

```python
features = tfds.features.FeaturesDict({
    'input': tfds.features.Image(),
    'output': tf.int32,
})
```

At generation time:

```python
for image, label in generate_examples:
  yield {
      'input': image,
      'output': label,
  }
```

At tf.data.Dataset() time:

```python
for example in tfds.load(...):
  tf_input = example['input']
  tf_output = example['output']
```
For nested features, the FeaturesDict will internally flatten the keys for the features and for the conversion to tf.train.Example. Indeed, the tf.train.Example proto does not support nested features, while tf.data.Dataset does. This internal transformation is invisible to the user (see the sketch after the example below).
Example:

```python
tfds.features.FeaturesDict({
    'input': tf.int32,
    'target': {
        'height': tf.int32,
        'width': tf.int32,
    },
})
```

Will internally store the data as:

```python
{
    'input': tf.io.FixedLenFeature(shape=(), dtype=tf.int32),
    'target/height': tf.io.FixedLenFeature(shape=(), dtype=tf.int32),
    'target/width': tf.io.FixedLenFeature(shape=(), dtype=tf.int32),
}
```
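The flattening only affects the serialized tf.train.Example; when the dataset is read back, the nested structure is restored. A minimal sketch of what that looks like at read time, assuming a hypothetical dataset named 'my_dataset' built with the spec above:

```python
import tensorflow_datasets as tfds

# 'my_dataset' is a hypothetical placeholder; any dataset defined with the
# nested FeaturesDict above behaves the same way.
ds = tfds.load('my_dataset', split='train')
for example in ds:
  # The nested dict structure is restored, even though the data is stored
  # under the flat 'target/height' / 'target/width' keys on disk.
  height = example['target']['height']
  width = example['target']['width']
```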
| Args | |
|---|---|
| feature_dict (dict) | Dictionary containing the feature connectors of an example. The keys should correspond to the data dict as returned by tf.data.Dataset(). Types (tf.int32, ...) and dicts will automatically be converted into FeatureConnector. |
| Raises | |
|---|---|
| ValueError | If one of the given features is not recognized. |
| Attributes | |
|---|---|
| dtype | Return the dtype (or dict of dtype) of this FeatureConnector. |
| shape | Return the shape (or dict of shape) of this FeatureConnector. |
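For a FeaturesDict, both attributes return a dict that mirrors the feature structure. A small sketch, assuming the default Image dtype (uint8) and shape ((None, None, 3)); the exact values depend on how each sub-feature is configured:

```python
import tensorflow as tf
import tensorflow_datasets as tfds

features = tfds.features.FeaturesDict({
    'input': tfds.features.Image(),
    'label': tf.int32,
})
features.dtype  # {'input': tf.uint8, 'label': tf.int32}
features.shape  # {'input': (None, None, 3), 'label': ()}
```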
Methods
decode_batch_example
```python
decode_batch_example(
    tfexample_data
)
```
Decode multiple features batched in a single tf.Tensor.

This function is used to decode features wrapped in tfds.features.Sequence().

By default, this function applies decode_example on each individual element using tf.map_fn. However, for optimization, features can override this method to apply a custom batch decoding.
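For context, decode_batch_example is what gets exercised when a sub-feature is wrapped in a Sequence. A minimal sketch of such a spec; the feature names and shapes are illustrative only:

```python
import tensorflow_datasets as tfds

# Each element of 'frames' is decoded by the Image connector; decoding the
# whole sequence goes through that connector's decode_batch_example.
features = tfds.features.FeaturesDict({
    'frames': tfds.features.Sequence(tfds.features.Image(shape=(64, 64, 3))),
    'label': tfds.features.ClassLabel(num_classes=5),
})
```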
| Args | |
|---|---|
| tfexample_data | Same tf.Tensor inputs as decode_example, but with an additional first dimension for the sequence length. |
| Returns | |
|---|---|
| tensor_data | Tensor or dictionary of tensors, output of the tf.data.Dataset object. |
decode_example
```python
decode_example(
    serialized_example, decoders=None
)
```
Decode the serialized examples.
| Args | |
|---|---|
| serialized_example | Nested dict of tf.Tensor. |
| decoders | Nested dict of Decoder objects which allow customizing the decoding. The structure should match the feature structure, but only customized feature keys need to be present. See the guide for more info, and the sketch below for a usage example. |
| Returns | |
|---|---|
| example | Nested dict containing the decoded nested examples. |
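decode_example is normally invoked by the tf.data pipeline rather than called directly; the decoders argument is usually forwarded from tfds.load. A short sketch, assuming the standard 'mnist' dataset with an 'image' feature:

```python
import tensorflow_datasets as tfds

# Skip the built-in image decoding and receive the raw encoded bytes instead.
ds = tfds.load(
    'mnist',
    split='train',
    decoders={'image': tfds.decode.SkipDecoding()},
)
```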
decode_ragged_example
```python
decode_ragged_example(
    tfexample_data
)
```
Decode nested features from a tf.RaggedTensor.

This function is used to decode features wrapped in nested tfds.features.Sequence().

By default, this function applies decode_batch_example on the flat values of the ragged tensor. For optimization, features can override this method to apply a custom batch decoding (see the sketch after the tables below).
| Args | |
|---|---|
| tfexample_data | tf.RaggedTensor inputs containing the nested encoded examples. |
| Returns | |
|---|---|
| tensor_data | The decoded tf.RaggedTensor or dictionary of tensors, output of the tf.data.Dataset object. |
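For context, nested Sequences are where ragged decoding appears: when the inner sequence lengths vary between examples, the data is surfaced as a tf.RaggedTensor. A minimal illustrative spec; the feature name is made up:

```python
import tensorflow_datasets as tfds

# 'paragraphs' is a sequence of variable-length sequences of text; reading it
# back yields a tf.RaggedTensor, decoded via decode_ragged_example on the
# inner connector.
features = tfds.features.FeaturesDict({
    'paragraphs': tfds.features.Sequence(
        tfds.features.Sequence(tfds.features.Text())
    ),
})
```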
encode_example
```python
encode_example(
    example_dict
)
```
See base class for details.
from_config
```python
@classmethod
from_config(
    root_dir: str
) -> "FeatureConnector"
```
Reconstructs the FeatureConnector from the config file.

Usage:

```python
features = FeatureConnector.from_config('path/to/features.json')
```
| Args | |
|---|---|
| root_dir | Directory containing the features.json file. |
| Returns | |
|---|---|
| The reconstructed feature instance. |
from_json
```python
@classmethod
from_json(
    value: tfds.typing.Json
) -> "FeatureConnector"
```
FeatureConnector factory.

This function should be called from the tfds.features.FeatureConnector base class. Subclasses should implement from_json_content.
Example:

```python
feature = tfds.features.FeatureConnector.from_json(
    {'type': 'Image', 'content': {'shape': [32, 32, 3], 'dtype': 'uint8'}}
)
assert isinstance(feature, tfds.features.Image)
```
| Args | |
|---|---|
| value | dict(type=, content=) containing the feature to restore. Matches the dict returned by to_json. |
| Returns | |
|---|---|
| The reconstructed FeatureConnector. |
from_json_content
```python
@classmethod
from_json_content(
    value: tfds.typing.Json
) -> "FeaturesDict"
```
FeatureConnector factory (to override).

Subclasses should override this method to allow importing the feature connector from the config.

This function should not be called directly; FeatureConnector.from_json should be called instead.

See existing FeatureConnectors for examples of implementation, and the sketch after the tables below.
| Args | |
|---|---|
| value | FeatureConnector information. Matches the dict returned by to_json_content. |
| Returns | |
|---|---|
| The reconstructed FeatureConnector. |
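A minimal sketch of what a subclass might implement, using a hypothetical MyScalar connector; the other abstract methods (get_tensor_info, encode_example, decode_example, ...) are omitted for brevity:

```python
import tensorflow as tf
import tensorflow_datasets as tfds


class MyScalar(tfds.features.FeatureConnector):
  """Hypothetical connector used only to illustrate the JSON hooks."""

  def __init__(self, dtype):
    self._dtype = tf.dtypes.as_dtype(dtype)

  @classmethod
  def from_json_content(cls, value) -> 'MyScalar':
    # `value` matches the dict produced by `to_json_content` below.
    return cls(dtype=value['dtype'])

  def to_json_content(self):
    return {'dtype': self._dtype.name}
```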
get_serialized_info
get_serialized_info()
See base class for details.
get_tensor_info
get_tensor_info()
See base class for details.
items
items()
keys
keys()
load_metadata
```python
load_metadata(
    data_dir, feature_name=None
)
```
See base class for details.
repr_html
```python
repr_html(
    ex: np.ndarray
) -> str
```
Returns the HTML str representation of the object.
repr_html_batch
```python
repr_html_batch(
    ex: np.ndarray
) -> str
```
Returns the HTML str representation of the object (Sequence).
repr_html_ragged
```python
repr_html_ragged(
    ex: np.ndarray
) -> str
```
Returns the HTML str representation of the object (Nested sequence).
save_config
```python
save_config(
    root_dir: str
) -> None
```

Exports the FeatureConnector to a file. A round-trip sketch follows the Args table below.
| Args | |
|---|---|
| root_dir | path/to/dir containing the features.json |
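A minimal round-trip sketch combining save_config and from_config; the /tmp/my_features path is a placeholder and is assumed to already exist:

```python
import tensorflow_datasets as tfds

features = tfds.features.FeaturesDict({
    'input': tfds.features.Image(),
    'label': tfds.features.ClassLabel(num_classes=10),
})
features.save_config('/tmp/my_features')  # writes /tmp/my_features/features.json

# Later: rebuild the same spec from the saved config.
restored = tfds.features.FeatureConnector.from_config('/tmp/my_features')
```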
save_metadata
```python
save_metadata(
    data_dir, feature_name=None
)
```
See base class for details.
to_json
to_json() -> tfds.typing.Json
Exports the FeatureConnector to Json.

Each feature is serialized as a dict(type=..., content=...):

* type: The canonical name of the feature (module.FeatureName).
* content: Specific to each feature connector and defined in to_json_content. Can contain nested sub-features (like for tfds.features.FeaturesDict and tfds.features.Sequence).
For example:

```python
tfds.features.FeaturesDict({
    'input': tfds.features.Image(),
    'target': tfds.features.ClassLabel(num_classes=10),
})
```

Is serialized as:

```json
{
    "type": "tensorflow_datasets.core.features.features_dict.FeaturesDict",
    "content": {
        "input": {
            "type": "tensorflow_datasets.core.features.image_feature.Image",
            "content": {
                "shape": [null, null, 3],
                "dtype": "uint8",
                "encoding_format": "png"
            }
        },
        "target": {
            "type": "tensorflow_datasets.core.features.class_label_feature.ClassLabel",
            "num_classes": 10
        }
    }
}
```
| Returns | |
|---|---|
| A dict(type=, content=). Will be forwarded to from_json when reconstructing the feature. |
to_json_content
to_json_content() -> tfds.typing.Json
FeatureConnector metadata export (to override).

This function should be overridden by the subclass to allow re-importing the feature connector from the config. See existing FeatureConnectors for examples of implementation.
| Returns | |
|---|---|
| Dict containing the FeatureConnector metadata. Will be forwarded to from_json_content when reconstructing the feature. |
values
values()
__contains__
```python
__contains__(
    k
)
```
__getitem__
```python
__getitem__(
    key
)
```
Return the feature associated with the key.
__iter__
__iter__()
__len__
__len__()
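Taken together, these dict-style methods let a FeaturesDict be inspected like a read-only mapping of sub-connectors. A small illustrative sketch:

```python
import tensorflow as tf
import tensorflow_datasets as tfds

features = tfds.features.FeaturesDict({
    'input': tfds.features.Image(),
    'label': tf.int32,
})

'input' in features       # True  (__contains__)
features['input']         # the tfds.features.Image connector (__getitem__)
sorted(features.keys())   # ['input', 'label']
len(features)             # 2     (__len__)
```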