tf.keras.layers.experimental.preprocessing.RandomCrop
Randomly crop the images to target height and width.
Inherits From: PreprocessingLayer, Layer, Module
View aliases

Compat aliases for migration. See the Migration guide for more details.

tf.compat.v1.keras.layers.experimental.preprocessing.RandomCrop
tf.keras.layers.experimental.preprocessing.RandomCrop(
height, width, seed=None, name=None, **kwargs
)
This layer will crop all the images in the same batch to the same cropping
location.
By default, random cropping is only applied during training. At inference
time, the images are first rescaled to preserve the shorter side and then
center-cropped. To apply random cropping at inference time, set
training=True when calling the layer.
Input shape:
4D tensor with shape:
(samples, height, width, channels), with data_format='channels_last'.
Output shape:
4D tensor with shape:
(samples, target_height, target_width, channels).
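A minimal usage sketch of the shapes above (the batch size, image size, and target crop size here are illustrative, not part of the API):

```python
import numpy as np
import tensorflow as tf

# A batch of 8 synthetic "images" of shape 32x32x3, values in [0, 1].
images = np.random.rand(8, 32, 32, 3).astype("float32")

# Crop every image in the batch to 24x24 at the same random location.
crop = tf.keras.layers.experimental.preprocessing.RandomCrop(
    height=24, width=24
)

# Outside of model.fit the layer runs in inference mode by default,
# so pass training=True to force random (rather than center) cropping.
out = crop(images, training=True)
print(out.shape)  # (8, 24, 24, 3)
```

Note that height and width here are the target output dimensions, so the input images must be at least that large in each spatial dimension.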
Arguments:
  height: Integer, the height of the output shape.
  width: Integer, the width of the output shape.
  seed: Integer. Used to create a random seed.
  name: A string, the name of the layer.
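Because the layer handles the training/inference distinction itself, it is commonly placed at the front of a model so augmentation runs as part of the forward pass. A hedged sketch (the model architecture below is purely illustrative):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    # Random 24x24 crops during training; resized center crops at inference.
    tf.keras.layers.experimental.preprocessing.RandomCrop(24, 24, seed=42),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])
print(model.output_shape)  # (None, 10)
```

Placing the crop inside the model (rather than in a tf.data pipeline) means the same saved model applies the correct deterministic cropping at serving time with no extra preprocessing code.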
Methods
adapt
View source
adapt(
data, reset_state=True
)
Fits the state of the preprocessing layer to the data being passed.
Arguments:
  data: The data to train on. It can be passed either as a tf.data
    Dataset or as a NumPy array.
  reset_state: Optional argument specifying whether to clear the state of
    the layer at the start of the call to adapt, or whether to start
    from the existing state. This argument may not be relevant to all
    preprocessing layers: a subclass of PreprocessingLayer may choose to
    throw if reset_state is set to False.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2021-02-18 UTC.