Bulk inferrer executor for batch inference on AI Platform.
Inherits From: `Executor`, `BaseExecutor`

```
tfx.extensions.google_cloud_ai_platform.bulk_inferrer.executor.Executor(
    context: Optional[tfx.dsl.components.base.base_executor.BaseExecutor.Context] = None
)
```
Methods
Do
```
Do(
    input_dict: Dict[Text, List[types.Artifact]],
    output_dict: Dict[Text, List[types.Artifact]],
    exec_properties: Dict[Text, Any]
) -> None
```
Runs batch inference on a given model with given input examples.
This function creates a new model (if necessary) and a new model version before inference, and cleans up those resources after inference. It is re-executable because it cleans up only the model resources it created during the process, even if the inference job failed.
| Args | |
|---|---|
| `input_dict` | Input dict from input key to a list of Artifacts. |
| `output_dict` | Output dict from output key to a list of Artifacts. |
| `exec_properties` | A dict of execution properties. |
| Returns | |
|---|---|
| None | |
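To illustrate the `Do()` contract, here is a minimal self-contained sketch of how an executor consumes `input_dict`, `output_dict`, and `exec_properties`. The `Artifact` stand-in class and the keys used below (`examples`, `model`, `inference_result`) are illustrative assumptions, not the exact classes or keys of this executor; a real run would deploy the model to AI Platform, submit a batch inference job, and then delete the model version it created.

```python
from typing import Any, Dict, List


class Artifact:
    """Hypothetical stand-in for types.Artifact: a payload at a URI."""

    def __init__(self, uri: str):
        self.uri = uri
        self.payload = None


def do(input_dict: Dict[str, List[Artifact]],
       output_dict: Dict[str, List[Artifact]],
       exec_properties: Dict[str, Any]) -> None:
    """Mimics the shape of Executor.Do(): read inputs, write results."""
    examples = input_dict["examples"]
    model = input_dict["model"][0]
    # A real executor would create a model version here, run batch
    # inference over `examples`, then clean up the version afterwards,
    # even if the inference job failed.
    for result in output_dict["inference_result"]:
        result.payload = [
            f"predicted({e.uri}) with {model.uri}" for e in examples
        ]


inputs = {"examples": [Artifact("gs://bucket/examples")],
          "model": [Artifact("gs://bucket/model")]}
outputs = {"inference_result": [Artifact("gs://bucket/results")]}
do(inputs, outputs, exec_properties={"custom_config": "{}"})
print(outputs["inference_result"][0].payload)
```

The pattern to note is that the executor never returns a value: it communicates results solely by writing to the artifacts in `output_dict`, which is why the method's return type is `None`.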