tfmot.quantization.keras.quantize_scope
Scope which can be used to deserialize quantized Keras models and layers.
tfmot.quantization.keras.quantize_scope(
*args
)
Used in the notebooks
Under `quantize_scope`, Keras methods such as `tf.keras.models.load_model` and `tf.keras.models.model_from_config` are able to deserialize Keras models and layers that contain quantization classes such as `QuantizeConfig` and `Quantizer`.
Example:
tf.keras.models.save_model(quantized_model, keras_file)

with quantize_scope():
  loaded_model = tf.keras.models.load_model(keras_file)

# If your quantized model uses custom objects such as a specific `Quantizer`,
# you can pass them to quantize_scope to deserialize your model.
with quantize_scope({'FixedRangeQuantizer': FixedRangeQuantizer}):
  loaded_model = tf.keras.models.load_model(keras_file)
For further understanding, see tf.keras.utils.custom_object_scope.
Args

`*args`
: Variable-length list of dictionaries of `{name, class}` pairs to add to the scope created by this method.

Returns

Object of type `CustomObjectScope` with quantization objects included.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2023-05-26 UTC.