
Module: tfmot.quantization.keras

Module containing quantization code built on Keras abstractions.

Modules

collaborative_optimizations module: Module containing collaborative optimization code.

default_8bit module: Module containing the default 8-bit quantization scheme.

graph_transformations module: Module containing code for graph transformation.

quantizers module: Module containing Quantization abstraction and quantizers.

Classes

class QuantizeConfig: ABC interface for Keras layers to express how they should be quantized (a subclass sketch follows this list).

class QuantizeLayoutTransform: Apply transformations to the model.

class QuantizeRegistry: ABC interface which specifies how layers should be quantized.

class QuantizeScheme: ABC interface which specifies transformer and quantization registry.

class QuantizeWrapper: Quantizes the weights and activations of the Keras layer it wraps.

class QuantizeWrapperV2: Quantizes the weights and activations of the Keras layer it wraps.
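The sketch below shows a minimal custom QuantizeConfig for a Dense-style layer, using the LastValueQuantizer and MovingAverageQuantizer classes from the quantizers module listed above. The class name, bit widths, and quantizer choices are illustrative assumptions, not the library's built-in configuration.

```python
import tensorflow_model_optimization as tfmot

LastValueQuantizer = tfmot.quantization.keras.quantizers.LastValueQuantizer
MovingAverageQuantizer = tfmot.quantization.keras.quantizers.MovingAverageQuantizer


class DenseQuantizeConfig(tfmot.quantization.keras.QuantizeConfig):
    """Illustrative QuantizeConfig for a Dense-style layer (hypothetical name)."""

    def get_weights_and_quantizers(self, layer):
        # Quantize the kernel with an 8-bit, symmetric, last-value quantizer.
        return [(layer.kernel,
                 LastValueQuantizer(num_bits=8, symmetric=True,
                                    narrow_range=False, per_axis=False))]

    def get_activations_and_quantizers(self, layer):
        # Quantize the activation with an 8-bit moving-average quantizer.
        return [(layer.activation,
                 MovingAverageQuantizer(num_bits=8, symmetric=False,
                                        narrow_range=False, per_axis=False))]

    def set_quantize_weights(self, layer, quantize_weights):
        # Replace the kernel with its quantized counterpart.
        layer.kernel = quantize_weights[0]

    def set_quantize_activations(self, layer, quantize_activations):
        # Replace the activation with its quantized counterpart.
        layer.activation = quantize_activations[0]

    def get_output_quantizers(self, layer):
        # No additional quantization on the layer outputs.
        return []

    def get_config(self):
        return {}
```

A config like this is passed to quantize_annotate_layer(layer, quantize_config=DenseQuantizeConfig()) and must be visible inside quantize_scope when quantize_apply is called on the annotated model.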

Functions

quantize_annotate_layer(...): Annotate a tf.keras layer to be quantized.

quantize_annotate_model(...): Annotate a tf.keras model to be quantized.

quantize_apply(...): Quantize a tf.keras model that has been annotated for quantization (see the usage sketch after this list).

quantize_model(...): Quantize a tf.keras model with the default quantization implementation.

quantize_scope(...): Scope which can be used to deserialize quantized Keras models and layers.
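A minimal sketch of how these functions fit together, assuming a small hypothetical Dense model and a hypothetical save path; it shows whole-model quantization, selective annotation followed by quantize_apply, and deserialization under quantize_scope.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

quantize = tfmot.quantization.keras

# A small hypothetical model used only for illustration.
base_model = tf.keras.Sequential([
    tf.keras.layers.Dense(20, activation='relu', input_shape=(10,)),
    tf.keras.layers.Dense(5),
])

# Option 1: quantize the entire model with the default 8-bit scheme.
q_aware_model = quantize.quantize_model(base_model)

# Option 2: annotate only selected layers, then apply quantization.
annotated_model = tf.keras.Sequential([
    quantize.quantize_annotate_layer(
        tf.keras.layers.Dense(20, activation='relu', input_shape=(10,))),
    tf.keras.layers.Dense(5),
])
q_selective_model = quantize.quantize_apply(annotated_model)

# Loading a saved quantized model requires quantize_scope so the
# quantization wrappers and quantizers can be deserialized.
q_aware_model.save('quantized_model.h5')  # hypothetical path
with quantize.quantize_scope():
    loaded_model = tf.keras.models.load_model('quantized_model.h5')
```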