tfp.experimental.auto_batching.type_inference.is_inferring
Returns whether type inference is running.
tfp.experimental.auto_batching.type_inference.is_inferring()
This can be useful for writing special primitives that change their behavior depending on whether they are being inferred, staged (see `virtual_machine.is_staging`), or neither (i.e., dry-run execution, see `frontend.Context.batch`).
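To illustrate the pattern, here is a minimal, self-contained sketch of a dynamic-scope flag like the one `is_inferring` queries, together with a primitive that branches on it. This is not the TFP implementation; the names `inferring_scope` and `special_primitive` are hypothetical, and the real flag is managed internally by the type-inference pass.

```python
import contextlib

_inferring = False  # module-level flag toggled by the (sketched) inference driver


def is_inferring():
    """Returns True inside the dynamic scope of type inference."""
    return _inferring


@contextlib.contextmanager
def inferring_scope():
    # The inference pass would enter this scope before walking the program.
    global _inferring
    old, _inferring = _inferring, True
    try:
        yield
    finally:
        _inferring = old


def special_primitive(x):
    # A primitive whose behavior depends on the dynamic scope:
    # return a placeholder during inference, compute for real otherwise.
    if is_inferring():
        return 0  # stand-in value for inference purposes
    return x * 2
```

Called normally (the "dry-run" case), `special_primitive` computes its result; inside `inferring_scope()`, `is_inferring()` returns `True` and the primitive short-circuits.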
Returns:
  inferring: Python `bool`, `True` if this is called in the dynamic scope of type inference, otherwise `False`.
Last updated 2023-11-21 UTC.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2023-11-21 UTC."],[],[]]