Although in many cases it's not necessary to understand all of the many ways
to configure a SavedModel, this method has a few practical implications (a
short end-to-end sketch follows the list):

- It will be treated as a graph for inference / serving (i.e. uses the tag
  [`saved_model.SERVING`](../../tf/saved_model#SERVING)).
- The SavedModel will load in TensorFlow Serving and supports the
  [Predict API](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/predict.proto).
  To use the Classify, Regress, or MultiInference APIs, please use either
  [tf.Estimator](https://www.tensorflow.org/api_docs/python/tf/estimator/Estimator)
  or the lower level
  [SavedModel APIs](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md).
- Some TensorFlow ops depend on information on disk or other information
  called "assets". These are generally handled automatically by adding the
  assets to the [`GraphKeys.ASSET_FILEPATHS`](../../tf/GraphKeys#ASSET_FILEPATHS)
  collection. Only assets in that collection are exported; if you need more
  custom behavior, you'll need to use the
  [SavedModelBuilder](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/builder.py).
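To make the first and last points concrete, here is a minimal sketch (TF 1.x
graph mode; the export path and the vocabulary asset are hypothetical, not
part of this API) that exports a trivial graph with `simple_save`, registers
an asset, and loads the result back under the SERVING tag:

    import tensorflow as tf

    # Trivial graph: z = x + y.
    x = tf.placeholder(tf.float32, shape=[None], name="x")
    y = tf.placeholder(tf.float32, shape=[None], name="y")
    z = tf.add(x, y, name="z")

    # Hypothetical asset file; registering its path tensor in
    # GraphKeys.ASSET_FILEPATHS makes simple_save copy the file into the
    # SavedModel's assets/ directory at export time.
    with open("/tmp/vocab.txt", "w") as f:
        f.write("hello\nworld\n")
    vocab_path = tf.constant("/tmp/vocab.txt")
    tf.add_to_collection(tf.GraphKeys.ASSET_FILEPATHS, vocab_path)

    export_dir = "/tmp/simple_save_demo"  # must not already exist
    with tf.Session() as sess:
        tf.saved_model.simple_save(
            sess, export_dir, inputs={"x": x, "y": y}, outputs={"z": z})

    # The export was tagged SERVING, so load it back with that tag; the
    # Predict-compatible signature is stored under "serving_default".
    with tf.Session(graph=tf.Graph()) as sess:
        meta_graph = tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], export_dir)
        print(meta_graph.signature_def["serving_default"])

The printed SignatureDef is the same structure that TensorFlow Serving's
Predict API consumes.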
More information about SavedModel and signatures can be found here:
<https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md>

#### Args

| Argument         | Description                                                                                       |
|------------------|---------------------------------------------------------------------------------------------------|
| `session`        | The TensorFlow session from which to save the meta graph and variables.                           |
| `export_dir`     | The path to which the SavedModel will be stored.                                                  |
| `inputs`         | dict mapping string input names to tensors. These are added to the SignatureDef as the inputs.    |
| `outputs`        | dict mapping string output names to tensors. These are added to the SignatureDef as the outputs.  |
| `legacy_init_op` | Legacy support for op or group of ops to execute after the restore op upon a load.                |
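As a sketch of how `legacy_init_op` fits in (assuming TF 1.x graph mode; the
export path, keys, and tensor names here are hypothetical): a graph that uses
a lookup table needs its table initializer re-run after variables are
restored, and `legacy_init_op` is the hook for that.

    import tensorflow as tf

    # A graph whose output depends on a lookup table. The table must be
    # (re)initialized whenever the SavedModel is loaded.
    keys = tf.constant(["small", "large"])
    values = tf.constant([0, 1], dtype=tf.int64)
    table = tf.lookup.StaticHashTable(
        tf.lookup.KeyValueTensorInitializer(keys, values), default_value=-1)

    size = tf.placeholder(tf.string, shape=[None], name="size")
    size_id = table.lookup(size)

    with tf.Session() as sess:
        sess.run(tf.tables_initializer())
        tf.saved_model.simple_save(
            sess,
            "/tmp/simple_save_with_tables",          # hypothetical export_dir
            inputs={"size": size},                   # dict keys become signature input names
            outputs={"size_id": size_id},            # dict keys become signature output names
            legacy_init_op=tf.tables_initializer())  # re-run after restore on load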