# tf.compat.v1.saved_model.simple_save

[View source on GitHub](https://github.com/tensorflow/tensorflow/blob/v2.16.1/tensorflow/python/saved_model/simple_save.py#L26-L84)

Convenience function to build a SavedModel suitable for serving. (deprecated)

    tf.compat.v1.saved_model.simple_save(
        session, export_dir, inputs, outputs, legacy_init_op=None
    )

### Used in the notebooks

Used in the guide:

- [Migrating your TFLite code to TF2](https://www.tensorflow.org/guide/migrate/tflite)
- [Migrate the SavedModel workflow](https://www.tensorflow.org/guide/migrate/saved_model)

**Deprecated:** THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: This API was designed for TensorFlow v1. See <https://www.tensorflow.org/guide/migrate> for instructions on how to migrate your code to TensorFlow v2.

In many common cases, saving models for serving will be as simple as:

    simple_save(session,
                export_dir,
                inputs={"x": x, "y": y},
                outputs={"z": z})

Although in many cases it's not necessary to understand all of the many ways to configure a SavedModel, this method has a few practical implications:

- It will be treated as a graph for inference/serving (i.e. it uses the tag [`saved_model.SERVING`](https://www.tensorflow.org/api_docs/python/tf/saved_model#SERVING)).
- The SavedModel will load in TensorFlow Serving and supports the [Predict API](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/predict.proto). To use the Classify, Regress, or MultiInference APIs, please see the [SavedModel APIs](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).
- Some TensorFlow ops depend on information on disk or other information called "assets". These are generally handled automatically by adding the assets to the `GraphKeys.ASSET_FILEPATHS` collection. Only assets in that collection are exported; if you need more custom behavior, you'll need to use the [SavedModelBuilder](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/builder.py).

More information about SavedModel and signatures can be found here:
<https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md>

### Args

| Arg | Description |
|------------------|--------------------------------------------------------------------------------------------------|
| `session` | The TensorFlow session from which to save the meta graph and variables. |
| `export_dir` | The path to which the SavedModel will be stored. |
| `inputs` | dict mapping string input names to tensors. These are added to the SignatureDef as the inputs. |
| `outputs` | dict mapping string output names to tensors. These are added to the SignatureDef as the outputs. |
| `legacy_init_op` | Legacy support for op or group of ops to execute after the restore op upon a load. |

Last updated 2024-04-26 UTC.
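As a concrete sketch of the round trip described above (the toy graph `z = w * x + y` and all tensor/variable names here are hypothetical, not part of the API): save a small graph with `simple_save`, then reload it under the `SERVING` tag that `simple_save` attaches and run the default serving signature, much as TensorFlow Serving's Predict API would.

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# simple_save requires that export_dir does not already exist.
export_dir = os.path.join(tempfile.mkdtemp(), "model")

# Build and save a toy graph: z = w * x + y.
save_graph = tf.Graph()
with save_graph.as_default():
    x = tf.placeholder(tf.float32, shape=[None], name="x")
    y = tf.placeholder(tf.float32, shape=[None], name="y")
    w = tf.Variable(3.0, name="w")
    z = tf.identity(w * x + y, name="z")
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        tf.saved_model.simple_save(
            sess, export_dir, inputs={"x": x, "y": y}, outputs={"z": z})

# Reload for inference using the SERVING tag and the default serving
# signature that simple_save created from the inputs/outputs dicts.
with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    sig = meta_graph.signature_def[
        tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    # The signature maps the string keys back to concrete tensor names.
    result = sess.run(sig.outputs["z"].name, feed_dict={
        sig.inputs["x"].name: [1.0, 2.0],
        sig.inputs["y"].name: [10.0, 20.0],
    })
    print(result)  # [13. 26.]
```

Note that the loaded session never references the Python objects from the saving graph; everything is resolved through the SignatureDef, which is exactly what a serving system relies on.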