# Optimize further
When pre-optimized models and post-training tools do not satisfy your use case,
the next step is to try the training-time tools.

Training-time tools hook into the model's loss function over the training data
so that the model can "adapt" to the changes introduced by the optimization
technique.
The starting point for using our training APIs is a Keras training script,
which can optionally be initialized from a pre-trained Keras model for further
fine-tuning.
Training-time tools available for you to try:
- [Weight pruning](./pruning/)
- [Quantization](./quantization/training)
- [Weight clustering](./clustering/)
- [Collaborative optimization](./combine/collaborative_optimization)