tf.contrib.tpu.export_estimator_savedmodel
Exports an Estimator-trained model for TPU inference.
tf.contrib.tpu.export_estimator_savedmodel(
estimator, export_dir_base, serving_input_receiver_fn, assets_extra=None,
as_text=False, checkpoint_path=None
)
Args
  estimator: Estimator with which the model has been trained.
  export_dir_base: A string containing a directory in which to create timestamped subdirectories containing exported SavedModels.
  serving_input_receiver_fn: A function that takes no argument and returns a ServingInputReceiver or TensorServingInputReceiver.
  assets_extra: A dict specifying how to populate the assets.extra directory within the exported SavedModel, or None if no extra assets are needed.
  as_text: Whether to write the SavedModel proto in text format.
  checkpoint_path: The checkpoint path to export. If None (the default), the most recent checkpoint found within the model directory is chosen.
Returns
  The string path to the exported directory.
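Example: a minimal usage sketch, assuming TensorFlow 1.x (tf.contrib is not available in TF 2.x); the estimator, feature name, and shapes below are hypothetical placeholders.

import tensorflow as tf  # TF 1.x

def serving_input_receiver_fn():
    # Hypothetical serving input: a single float feature "x" of shape [None, 4].
    features = {"x": tf.placeholder(dtype=tf.float32, shape=[None, 4], name="x")}
    return tf.estimator.export.ServingInputReceiver(features, features)

# `my_tpu_estimator` is assumed to be an already-trained TPUEstimator
# (or any tf.estimator.Estimator) built elsewhere.
export_path = tf.contrib.tpu.export_estimator_savedmodel(
    estimator=my_tpu_estimator,
    export_dir_base="/tmp/exported_savedmodel",
    serving_input_receiver_fn=serving_input_receiver_fn,
)
print("SavedModel written to:", export_path)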