Ensure a FakeQuant (FQ) operation does not get placed between Add and ReLU.

tfmot.quantization.keras.experimental.default_n_bit.default_n_bit_transforms.LayerReluActivationQuantize(
    num_bits_weight: int = 8,
    num_bits_activation: int = 8
)
custom_objects()

Dictionary of custom objects introduced by the Transform.

A Transform may introduce custom classes and types unknown to Keras. This
function should return a dictionary containing these objects in case such
types are introduced, so that model construction can serialize/deserialize
them.

Returns: Custom objects introduced by the transform as a dictionary.
pattern()

Return the LayerPattern to find in the model graph.
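The exact pattern this transform matches is defined in the tfmot source. Purely as an illustration, a "ReLU activation fed by an Add layer" pattern could be described as nested class-name matchers; the dict form below is a hypothetical stand-in for tfmot's LayerPattern, not the real class:

```python
# Hypothetical stand-in for tfmot's LayerPattern: a nested structure naming
# the layer class to match, its config constraints, and the patterns of its
# input layers.
relu_after_add = {
    "class_name": "Activation",
    "config": {"activation": "relu"},
    "inputs": [
        {"class_name": "Add", "config": {}, "inputs": []},
    ],
}

def describe(pattern):
    """Render the pattern tree as 'Add -> Activation(relu)' style text."""
    inputs = pattern.get("inputs", [])
    prefix = " + ".join(describe(p) for p in inputs)
    act = pattern.get("config", {}).get("activation")
    name = f"{pattern['class_name']}({act})" if act else pattern["class_name"]
    return f"{prefix} -> {name}" if prefix else name

print(describe(relu_after_add))  # Add -> Activation(relu)
```

A pattern like this is what lets the quantization pass recognize the Add/ReLU pair as one unit instead of quantizing the Add output on its own.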
replacement(match_layer)

Generate a replacement sub-graph for the matched sub-graph.
The fundamental constraint on the replacement is that the replacement sub-graph must consume the same input tensors as the original sub-graph and produce a final list of tensors that match the original sub-graph's outputs in number and shape. Violating this constraint can crash model creation or introduce bugs into the new model graph.
A future extension may pass the list of input layers feeding into the matched sub-graph, and the output layers feeding from the tip of the tree, as additional parameters; these would be needed for complex replacement cases.
Args: match_layer: Matched sub-graph based on self.pattern().
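The input/output contract described above can be sketched with minimal stand-in classes. These mimic the shape of a graph-transform API and are not the actual tfmot classes; the `AddReLU` fused node is likewise a hypothetical name:

```python
# Illustrative sketch only: minimal stand-ins for a matched sub-graph tree.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LayerNode:
    """A matched layer plus the input nodes feeding into it."""
    layer_class: str
    input_layers: List["LayerNode"] = field(default_factory=list)

def leaf_inputs(node: LayerNode) -> List[LayerNode]:
    """Collect the input layers at the bottom of a matched sub-graph tree."""
    if not node.input_layers:
        return [node]
    leaves = []
    for inp in node.input_layers:
        leaves.extend(leaf_inputs(inp))
    return leaves

# A matched Add -> ReLU sub-graph: ReLU is the tip, Add feeds it.
add = LayerNode("Add", [LayerNode("Input"), LayerNode("Input")])
match = LayerNode("ReLU", [add])

# A valid replacement must consume the same input tensors and expose one tip,
# e.g. a fused node (hypothetical "AddReLU") that keeps Add and ReLU together
# so no FakeQuant is inserted between them.
replacement = LayerNode("AddReLU", add.input_layers)

# Contract check: same number of consumed input tensors before and after.
assert len(leaf_inputs(match)) == len(leaf_inputs(replacement))
```

The same idea extends to output shapes: a real implementation would also verify that the replacement's final tensors match the originals in shape, not just in count.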