Module: tfmot.quantization.keras.default_8bit.default_8bit_transforms

Module containing the default 8-bit transforms applied when quantizing Keras models.

Classes

class ConcatTransform: Transform for Concatenate. Quantize only after concatenation.

class ConcatTransform3Inputs: Transform for a 3-input Concatenate. Quantize only after concatenation.

class ConcatTransform4Inputs: Transform for a 4-input Concatenate. Quantize only after concatenation.

class ConcatTransform5Inputs: Transform for a 5-input Concatenate. Quantize only after concatenation.

class ConcatTransform6Inputs: Transform for a 6-input Concatenate. Quantize only after concatenation.

class Conv2DBatchNormActivationQuantize: Transform to be applied to "Conv2D" + "BatchNorm" + "Activation(relu)" Graph.

class Conv2DBatchNormQuantize: Transform to be applied to "Conv2D" + "BatchNorm" Graph.

class Conv2DBatchNormReLUQuantize: Transform to be applied to "Conv2D" + "BatchNorm" + "ReLU" Graph.

class Conv2DReshapeBatchNormActivationQuantize: Transform to be applied to "Conv2D" + "Reshape" + "BatchNorm" + "Activation(relu)" Graph.

class Conv2DReshapeBatchNormQuantize: Transform to be applied to "Conv2D" + "Reshape" + "BatchNorm" Graph.

class Conv2DReshapeBatchNormReLUQuantize: Transform to be applied to "Conv2D" + "Reshape" + "BatchNorm" + "ReLU" Graph.

class InputLayerQuantize: Quantizes InputLayer by adding a QuantizeLayer after it.

class LayerReLUQuantize: Transform to be applied to "Add" + "ReLU" Graph.

class LayerReluActivationQuantize: Transform to be applied to "Add" + "Activation(relu)" Graph.

class SeparableConv1DQuantize: Adds QAT support for the Keras SeparableConv1D layer.

class SeparableConvQuantize: Breaks SeparableConv into a DepthwiseConv and a Conv layer.
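
These transforms are normally not invoked directly: they are applied internally by the default 8-bit quantization scheme when a model is quantized. Below is a minimal, illustrative sketch (layer sizes and input shape are arbitrary, not taken from this page) of a model containing the "Conv2D" + "BatchNorm" + "ReLU" pattern that Conv2DBatchNormReLUQuantize matches, quantized with tfmot.quantization.keras.quantize_model, which runs the default 8-bit transforms as part of quantization.

```python
# Minimal sketch, assuming a standard TensorFlow / tfmot installation.
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Float model containing a Conv2D + BatchNormalization + ReLU subgraph.
float_model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, input_shape=(32, 32, 3)),  # illustrative sizes
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.ReLU(),
])

# quantize_model annotates the layers and applies the default 8-bit scheme;
# during that step the transforms listed above rewrite matching subgraphs
# such as Conv2D + BatchNorm + ReLU before quantize wrappers are inserted.
quantized_model = tfmot.quantization.keras.quantize_model(float_model)
quantized_model.summary()
```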