
The Trainer TFX Pipeline Component

The Trainer TFX pipeline component trains a TensorFlow model.

Trainer and TensorFlow

Trainer makes extensive use of the Python TensorFlow API for training models.

Component

Trainer takes:

  • tf.Examples used for training and eval.
  • A user-provided module file that defines the trainer logic.
  • A data schema created by a SchemaGen pipeline component and optionally altered by the developer.
  • Protobuf definitions of train args and eval args.
  • (Optional) A transform graph produced by an upstream Transform component.
  • (Optional) A pre-trained model, used for scenarios such as warm-starting.
  • (Optional) Hyperparameters, which will be passed to the user module function. Details of the integration with Tuner can be found here; a configuration sketch follows this list.
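
When a Tuner component is present in the pipeline, its best hyperparameters can be fed to Trainer. The following is a minimal sketch, assuming a standard Tuner integration in which the Tuner output channel is named best_hyperparameters and Trainer accepts a hyperparameters argument:

trainer = Trainer(
    module_file=module_file,
    examples=transform.outputs['transformed_examples'],
    schema=infer_schema.outputs['schema'],
    # Best hyperparameters found by an upstream Tuner component.
    hyperparameters=tuner.outputs['best_hyperparameters'],
    train_args=trainer_pb2.TrainArgs(num_steps=10000),
    eval_args=trainer_pb2.EvalArgs(num_steps=5000))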

Trainer emits: at least one model for inference/serving (typically in SavedModel format), and optionally another model for eval (typically an EvalSavedModel).

We provide support for alternative model formats such as TFLite through the Model Rewriting Library. See the Model Rewriting Library link for examples of how to convert both Estimator and Keras models.
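
As a rough sketch, assuming the rewriting happens inside a generic Trainer's run_fn and that temp_saving_model_dir is a placeholder directory holding an already-exported SavedModel, the rewriting library can be invoked like this:

import tfx.components.trainer.rewriting.converters as converters
import tfx.components.trainer.rewriting.rewriter as rewriter
import tfx.components.trainer.rewriting.rewriter_factory as rewriter_factory

...

  # Rewrite the SavedModel exported to a temporary directory into TFLite
  # format at the serving model location that Trainer expects.
  tfrw = rewriter_factory.create_rewriter(
      rewriter_factory.TFLITE_REWRITER, name='tflite_rewriter')
  converters.rewrite_saved_model(temp_saving_model_dir,
                                 fn_args.serving_model_dir,
                                 tfrw,
                                 rewriter.ModelType.TFLITE_MODEL)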

Estimator based Trainer

To learn about using an Estimator based model with TFX and Trainer, see Designing TensorFlow modeling code with tf.Estimator for TFX.

Configuring a Trainer Component

Typical pipeline Python DSL code looks like this:

from tfx.components import Trainer

...

trainer = Trainer(
      module_file=module_file,
      examples=transform.outputs['transformed_examples'],
      schema=infer_schema.outputs['schema'],
      base_models=latest_model_resolver.outputs['latest_model'],
      transform_graph=transform.outputs['transform_graph'],
      train_args=trainer_pb2.TrainArgs(num_steps=10000),
      eval_args=trainer_pb2.EvalArgs(num_steps=5000))

Trainer invokes a training module, which is specified in the module_file parameter. A typical training module looks like this:

import tensorflow as tf
import tensorflow_transform as tft


# TFX will call this function
def trainer_fn(trainer_fn_args, schema):
  """Build the estimator using the high level API.

  Args:
    trainer_fn_args: Holds args used to train the model as name/value pairs.
    schema: Holds the schema of the training examples.

  Returns:
    A dict of the following:

      - estimator: The estimator that will be used for training and eval.
      - train_spec: Spec for training.
      - eval_spec: Spec for eval.
      - eval_input_receiver_fn: Input function for eval.
  """
  # Number of nodes in the first layer of the DNN
  first_dnn_layer_size = 100
  num_dnn_layers = 4
  dnn_decay_factor = 0.7

  train_batch_size = 40
  eval_batch_size = 40

  tf_transform_output = tft.TFTransformOutput(trainer_fn_args.transform_output)

  train_input_fn = lambda: _input_fn(  # pylint: disable=g-long-lambda
      trainer_fn_args.train_files,
      tf_transform_output,
      batch_size=train_batch_size)

  eval_input_fn = lambda: _input_fn(  # pylint: disable=g-long-lambda
      trainer_fn_args.eval_files,
      tf_transform_output,
      batch_size=eval_batch_size)

  train_spec = tf.estimator.TrainSpec(  # pylint: disable=g-long-lambda
      train_input_fn,
      max_steps=trainer_fn_args.train_steps)

  serving_receiver_fn = lambda: _example_serving_receiver_fn(  # pylint: disable=g-long-lambda
      tf_transform_output, schema)

  exporter = tf.estimator.FinalExporter('chicago-taxi', serving_receiver_fn)
  eval_spec = tf.estimator.EvalSpec(
      eval_input_fn,
      steps=trainer_fn_args.eval_steps,
      exporters=[exporter],
      name='chicago-taxi-eval')

  run_config = tf.estimator.RunConfig(
      save_checkpoints_steps=999, keep_checkpoint_max=1)

  run_config = run_config.replace(model_dir=trainer_fn_args.serving_model_dir)
  warm_start_from = trainer_fn_args.base_models[
      0] if trainer_fn_args.base_models else None

  estimator = _build_estimator(
      # Construct layer sizes with exponential decay
      hidden_units=[
          max(2, int(first_dnn_layer_size * dnn_decay_factor**i))
          for i in range(num_dnn_layers)
      ],
      config=run_config,
      warm_start_from=warm_start_from)

  # Create an input receiver for TFMA processing
  receiver_fn = lambda: _eval_input_receiver_fn(  # pylint: disable=g-long-lambda
      tf_transform_output, schema)

  return {
      'estimator': estimator,
      'train_spec': train_spec,
      'eval_spec': eval_spec,
      'eval_input_receiver_fn': receiver_fn
  }
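
The module above references a private _input_fn helper. A minimal sketch of such a helper, assuming gzipped TFRecords of transformed tf.Examples and a hypothetical LABEL_KEY for the transformed label, could look like this:

LABEL_KEY = 'label'  # Hypothetical; use the transformed label key of your dataset.


def _gzip_reader_fn(filenames):
  """Reads the gzipped TFRecord files emitted by Transform."""
  return tf.data.TFRecordDataset(filenames, compression_type='GZIP')


def _input_fn(filenames, tf_transform_output, batch_size=200):
  """Generates batched features and labels for training or eval."""
  transformed_feature_spec = (
      tf_transform_output.transformed_feature_spec().copy())

  dataset = tf.data.experimental.make_batched_features_dataset(
      file_pattern=filenames,
      batch_size=batch_size,
      features=transformed_feature_spec,
      reader=_gzip_reader_fn,
      shuffle=True)

  transformed_features = (
      tf.compat.v1.data.make_one_shot_iterator(dataset).get_next())
  # Pop the label out of the feature dict and return (features, label).
  return transformed_features, transformed_features.pop(LABEL_KEY)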

Generic Trainer

The generic Trainer enables developers to use any TensorFlow model API with the Trainer component. In addition to TensorFlow Estimators, developers can use Keras models or custom training loops. For details, please see the RFC for the generic Trainer.

Configuring the Trainer Component to use the GenericExecutor

Typical pipeline DSL code for the generic Trainer looks like this:

from tfx.components import Trainer
from tfx.components.base import executor_spec
from tfx.components.trainer.executor import GenericExecutor

...

trainer = Trainer(
    module_file=module_file,
    custom_executor_spec=executor_spec.ExecutorClassSpec(GenericExecutor),
    examples=transform.outputs['transformed_examples'],
    transform_graph=transform.outputs['transform_graph'],
    schema=infer_schema.outputs['schema'],
    train_args=trainer_pb2.TrainArgs(num_steps=10000),
    eval_args=trainer_pb2.EvalArgs(num_steps=5000))

Trainer invokes a training module, which is specified in the module_file parameter. Instead of trainer_fn, a run_fn is required in the module file if the GenericExecutor is specified in custom_executor_spec.

If the Transform component is not used in the pipeline, then the Trainer takes the examples from ExampleGen directly:

trainer = Trainer(
    module_file=module_file,
    custom_executor_spec=executor_spec.ExecutorClassSpec(GenericExecutor),
    examples=example_gen.outputs['examples'],
    schema=infer_schema.outputs['schema'],
    train_args=trainer_pb2.TrainArgs(num_steps=10000),
    eval_args=trainer_pb2.EvalArgs(num_steps=5000))

Here is an example module file with run_fn.
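
As a minimal sketch, a Keras-based run_fn might look like the following. The _input_fn and _build_keras_model helpers are hypothetical placeholders, and the TrainerFnArgs import path may differ across TFX versions:

from tfx.components.trainer.executor import TrainerFnArgs


def run_fn(fn_args: TrainerFnArgs):
  """Trains the model based on the args provided by Trainer.

  Args:
    fn_args: Holds args used to train the model as name/value pairs.
  """
  # Build tf.data datasets over the train and eval file patterns.
  train_dataset = _input_fn(fn_args.train_files, batch_size=40)
  eval_dataset = _input_fn(fn_args.eval_files, batch_size=40)

  # Hypothetical helper that returns a compiled tf.keras.Model.
  model = _build_keras_model()

  model.fit(
      train_dataset,
      steps_per_epoch=fn_args.train_steps,
      validation_data=eval_dataset,
      validation_steps=fn_args.eval_steps)

  # Trainer expects the serving model to be written to fn_args.serving_model_dir.
  model.save(fn_args.serving_model_dir, save_format='tf')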