tf.lite.experimental.QuantizationDebugger

Debugger for Quantized TensorFlow Lite debug mode models.

This debugger can run TensorFlow Lite converted models equipped with debug ops and collect debug information. It calculates statistics both from default metrics and from user-defined post-processing functions.
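As a rough illustration of the per-layer comparison the debugger performs between float and quantized tensor values, here is a NumPy sketch. The function and metric names below are illustrative stand-ins, not the debugger's exact API:

```python
import numpy as np

def layer_error_stats(float_out: np.ndarray, quant_out: np.ndarray) -> dict:
    """Sketch of default-style per-layer statistics comparing a layer's
    float output with its dequantized quantized output. Metric names
    here are illustrative, not the exact keys the debugger emits."""
    diff = quant_out.astype(np.float64) - float_out.astype(np.float64)
    return {
        "num_elements": diff.size,
        "mean_error": float(diff.mean()),
        "stddev": float(diff.std()),
        "max_abs_error": float(np.abs(diff).max()),
        "mean_squared_error": float((diff ** 2).mean()),
    }

float_out = np.array([0.0, 0.5, 1.0, 1.5])
quant_out = np.array([0.0, 0.4, 1.1, 1.5])
stats = layer_error_stats(float_out, quant_out)
```

A layer whose `max_abs_error` or `stddev` is large relative to its peers is a candidate for keeping in float precision.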

Args
quant_debug_model_path: Path to the quantized debug TFLite model file.
quant_debug_model_content: Content of the quantized debug TFLite model.
float_model_path: Path to the float TFLite model file.
float_model_content: Content of the float TFLite model.
debug_dataset: A factory function that returns a dataset generator, used to generate input samples (a list of np.ndarray) for the model. The generated elements must have the same types and shapes as the model's inputs.
debug_options: Debug options to debug the given model.
converter: Optional. If provided, the converter is used instead of the quantized model path/content.
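The `debug_dataset` argument is a factory: calling it must return a fresh generator of input samples. A plain generator function satisfies this; the input shape and dtype below are hypothetical and must match your model's actual inputs:

```python
import numpy as np

def representative_dataset():
    """Hypothetical debug_dataset factory for a model with one float32
    input of shape (1, 28, 28, 1); adjust shapes/dtypes to your model."""
    rng = np.random.default_rng(seed=0)
    for _ in range(8):
        # Each sample is a list of np.ndarray, one entry per model input.
        yield [rng.random((1, 28, 28, 1), dtype=np.float32)]

# Calling the factory returns a generator; each element is a list of arrays.
sample = next(representative_dataset())
```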

Raises
ValueError: If the debugger was unable to be created.

Attributes
options: Debug options used by the debugger.

Methods

get_debug_quantized_model

View source

Returns an instrumented quantized model.

Convert the quantized model with the initialized converter and return the model bytes. The model will be instrumented with numeric verification operations and should only be used for debugging.

Returns
Model bytes corresponding to the model.

Raises
ValueError: If converter is not passed to the debugger.

get_nondebug_quantized_model

View source

Returns a non-instrumented quantized model.

Convert the quantized model with the initialized converter and return the non-debug model bytes. The model will not be instrumented with numeric verification operations.

Returns
Model bytes corresponding to the model.

Raises
ValueError: If converter is not passed to the debugger.

layer_statistics_dump

View source

Dumps layer statistics into a file in CSV format.

Args
file: File, or file-like object, to write to.
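Because any file-like object is accepted, the statistics can be dumped to an in-memory buffer. A stdlib sketch of the resulting CSV shape; the column names and values here are hypothetical, not the debugger's actual output schema:

```python
import csv
import io

# Hypothetical per-layer statistics; the real columns come from the
# debugger's default and user-defined metrics.
layer_stats = [
    {"tensor_name": "conv1", "mean_error": 0.002, "stddev": 0.01},
    {"tensor_name": "conv2", "mean_error": -0.001, "stddev": 0.02},
]

buf = io.StringIO()  # any file-like object works, e.g. open("stats.csv", "w")
writer = csv.DictWriter(buf, fieldnames=["tensor_name", "mean_error", "stddev"])
writer.writeheader()
writer.writerows(layer_stats)
csv_text = buf.getvalue()
```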

run

View source

Runs the models on the debug dataset and collects metrics.
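Conceptually, run() iterates over the debug dataset, invokes the instrumented model, and aggregates per-layer metrics across samples. The loop below is a plain-Python sketch of that shape; the model invocation is faked with a simple rounding quantizer, and all names are illustrative:

```python
import numpy as np

def run_debugger(dataset_factory, compare_fn):
    """Sketch of the run loop: apply compare_fn to each (float, quantized)
    output pair and average the resulting metrics over the dataset.
    Both the model invocation and the metric set are stand-ins."""
    totals, count = {}, 0
    for sample in dataset_factory():
        float_out = sample[0]                    # stand-in for the float model output
        quant_out = np.round(sample[0] * 8) / 8  # stand-in: quantize to steps of 1/8
        for name, value in compare_fn(float_out, quant_out).items():
            totals[name] = totals.get(name, 0.0) + value
        count += 1
    return {name: total / count for name, total in totals.items()}

def dataset_factory():
    rng = np.random.default_rng(1)
    for _ in range(4):
        yield [rng.random((1, 4), dtype=np.float32)]

metrics = run_debugger(
    dataset_factory,
    lambda f, q: {"max_abs_error": float(np.abs(q - f).max())},
)
```

With step size 1/8, rounding error per element is at most 1/16, so the averaged `max_abs_error` stays within that bound.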