Your model's input layer should consume from the SavedModel that was created by a Transform component, and the layers of the Transform model should be included with your model, so that when you export your SavedModel and EvalSavedModel they will include the transformations that were created by the Transform component.
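One way this plays out in an Estimator-based setup is to apply the Transform output inside the serving input receiver, so the exported model carries the preprocessing graph with it. The sketch below is illustrative rather than part of this guide's example code: it assumes `tensorflow_transform` is available, that `tf_transform_dir` is the output directory written by the Transform component, and that `raw_feature_spec` describes the raw (untransformed) features without the label; the helper name is hypothetical.

```python
import tensorflow as tf
import tensorflow_transform as tft


def _example_serving_receiver_fn(tf_transform_dir, raw_feature_spec):
  """Serving input receiver that applies the tf.Transform preprocessing graph."""
  # Wrap the output written by the Transform component.
  tf_transform_output = tft.TFTransformOutput(tf_transform_dir)

  # Parse raw (untransformed) tf.Examples from the serving request.
  # Assumption: raw_feature_spec does not include the label feature.
  raw_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
      raw_feature_spec, default_batch_size=None)
  serving_input_receiver = raw_input_fn()

  # Apply the saved transformations so the model receives transformed features
  # and the exported SavedModel embeds the preprocessing graph.
  transformed_features = tf_transform_output.transform_raw_features(
      serving_input_receiver.features)

  return tf.estimator.export.ServingInputReceiver(
      transformed_features, serving_input_receiver.receiver_tensors)
```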
A typical TensorFlow model design for TFX looks like this:
```python
def _build_estimator(tf_transform_dir,
                     config,
                     hidden_units=None,
                     warm_start_from=None):
  """Build an estimator for predicting the tipping behavior of taxi riders.

  Args:
    tf_transform_dir: directory in which the tf-transform model was written
      during the preprocessing step.
    config: tf.estimator.RunConfig defining the runtime environment for the
      estimator (including model_dir).
    hidden_units: [int], the layer sizes of the DNN (input layer first).
    warm_start_from: Optional directory to warm start from.

  Returns:
    Resulting DNNLinearCombinedClassifier.
  """
  # Read the schema that tf.Transform wrote during preprocessing.
  metadata_dir = os.path.join(tf_transform_dir,
                              transform_fn_io.TRANSFORMED_METADATA_DIR)
  transformed_metadata = metadata_io.read_metadata(metadata_dir)
  transformed_feature_spec = transformed_metadata.schema.as_feature_spec()

  # The label is not a model input feature.
  transformed_feature_spec.pop(_transformed_name(_LABEL_KEY))

  real_valued_columns = [
      tf.feature_column.numeric_column(key, shape=())
      for key in _transformed_names(_DENSE_FLOAT_FEATURE_KEYS)
  ]
  categorical_columns = [
      tf.feature_column.categorical_column_with_identity(
          key, num_buckets=_VOCAB_SIZE + _OOV_SIZE, default_value=0)
      for key in _transformed_names(_VOCAB_FEATURE_KEYS)
  ]
  categorical_columns += [
      tf.feature_column.categorical_column_with_identity(
          key, num_buckets=_FEATURE_BUCKET_COUNT, default_value=0)
      for key in _transformed_names(_BUCKET_FEATURE_KEYS)
  ]
  categorical_columns += [
      tf.feature_column.categorical_column_with_identity(
          key, num_buckets=num_buckets, default_value=0)
      for key, num_buckets in zip(
          _transformed_names(_CATEGORICAL_FEATURE_KEYS),
          _MAX_CATEGORICAL_FEATURE_VALUES)
  ]
  return tf.estimator.DNNLinearCombinedClassifier(
      config=config,
      linear_feature_columns=categorical_columns,
      dnn_feature_columns=real_valued_columns,
      dnn_hidden_units=hidden_units or [100, 70, 50, 25],
      warm_start_from=warm_start_from)
```
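For context, a builder like this is typically invoked from the Trainer's entry point with a run configuration. The call below is a hypothetical usage sketch, not part of the example above: the directory paths and checkpointing cadence are placeholders, and it assumes a `tf.estimator.RunConfig` is passed as `config`.

```python
import tensorflow as tf

# Placeholder run configuration; model_dir and checkpoint cadence are illustrative.
run_config = tf.estimator.RunConfig(
    model_dir='/tmp/taxi_model',
    save_checkpoints_steps=999)

# Build the wide-and-deep estimator from the Transform component's output directory.
estimator = _build_estimator(
    tf_transform_dir='/tmp/tft_output',  # directory written by Transform
    config=run_config,
    hidden_units=[100, 70, 50, 25])
```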