tf.raw_ops.BroadcastTo
Broadcast an array for a compatible shape.
View aliases

Compat aliases for migration

See the Migration guide for more details.

tf.compat.v1.raw_ops.BroadcastTo
tf.raw_ops.BroadcastTo(
input, shape, name=None
)
Broadcasting is the process of making arrays have compatible shapes
for arithmetic operations. Two shapes are compatible if, for each
dimension pair, the dimensions are either equal or one of them is 1.
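This compatibility rule can be sketched as a small plain-Python check (`shapes_compatible` is a hypothetical helper for illustration, not part of the TensorFlow API):

```python
def shapes_compatible(a, b):
    """Return True if shapes a and b are broadcast-compatible.

    The shorter shape is padded on the left with ones; then each
    dimension pair must be equal or contain a 1.
    """
    n = max(len(a), len(b))
    # Left-pad the shorter shape with ones.
    a = (1,) * (n - len(a)) + tuple(a)
    b = (1,) * (n - len(b)) + tuple(b)
    return all(x == y or x == 1 or y == 1 for x, y in zip(a, b))

print(shapes_compatible((1, 3), (2, 3)))  # True
print(shapes_compatible((2, 3), (4, 3)))  # False
```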
For example:
x = tf.constant([[1, 2, 3]]) # Shape (1, 3)
y = tf.broadcast_to(x, [2, 3])
print(y)
tf.Tensor(
[[1 2 3]
[1 2 3]], shape=(2, 3), dtype=int32)
In the example above, the input Tensor with shape [1, 3]
is broadcast to an output Tensor with shape [2, 3].
When broadcasting, if a tensor has fewer axes than necessary its shape is
padded on the left with ones. So this gives the same result as the previous
example:
x = tf.constant([1, 2, 3]) # Shape (3,)
y = tf.broadcast_to(x, [2, 3])
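The left-padding equivalence can be verified with NumPy, which follows the same broadcasting rules (used here only as a runnable stand-in for the TensorFlow snippets above):

```python
import numpy as np

x_row = np.array([[1, 2, 3]])  # shape (1, 3)
x_vec = np.array([1, 2, 3])    # shape (3,)

# Broadcasting the (3,) vector left-pads its shape to (1, 3),
# so both broadcasts produce the same (2, 3) result.
y1 = np.broadcast_to(x_row, (2, 3))
y2 = np.broadcast_to(x_vec, (2, 3))
print(np.array_equal(y1, y2))  # True
```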
When doing broadcasted operations such as multiplying a tensor
by a scalar, broadcasting (usually) confers some time or space
benefit, as the broadcasted tensor is never materialized.
However, broadcast_to
does not carry any such benefit.
The newly created tensor takes the full memory of the broadcast
shape. (In a graph context, broadcast_to
might be fused into a
subsequent operation and then optimized away, however.)
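The materialization point is easy to inspect in NumPy, whose np.broadcast_to returns a zero-stride view rather than a copy; forcing a copy allocates the full broadcast shape, analogous to the memory tf.broadcast_to's output occupies (NumPy is an assumption-level stand-in here, not the TF implementation):

```python
import numpy as np

x = np.array([1, 2, 3], dtype=np.int64)  # 3 elements of data
view = np.broadcast_to(x, (1000, 3))     # zero-stride view: no new data
copy = np.array(view)                    # materialized: full 1000x3 buffer

print(view.strides)  # (0, 8): every row reuses the same 3 elements
print(copy.strides)  # (24, 8): each row has its own storage
```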
Args

input: A Tensor. A Tensor to broadcast.
shape: A Tensor. Must be one of the following types: int32, int64.
A 1-D int Tensor. The shape of the desired output.
name: A name for the operation (optional).

Returns

A Tensor. Has the same type as input.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2024-04-26 UTC.