Returns the single element of the dataset as a nested structure of tensors. (deprecated)

This function enables you to use a tf.data.Dataset in a stateless "tensor-in tensor-out" expression, without creating an iterator. This facilitates data transformation on tensors using the optimized tf.data.Dataset abstraction on top of them.

For example, let's consider a preprocessing_fn which takes the raw features as input and returns the processed feature along with its label.

def preprocessing_fn(raw_feature):
  # ... the raw_feature is preprocessed as per the use-case
  return feature

raw_features = ...  # input batch of BATCH_SIZE elements.
dataset = (tf.data.Dataset.from_tensor_slices(raw_features)
           .map(preprocessing_fn, num_parallel_calls=BATCH_SIZE)
           .batch(BATCH_SIZE))

processed_features = tf.data.experimental.get_single_element(dataset)

In the above example, the raw_features tensor of length BATCH_SIZE was converted to a tf.data.Dataset. Next, each raw_feature was mapped using the preprocessing_fn, and the processed features were grouped into a single batch. The final dataset contains only one element, which is a batch of all the processed features.

Now, instead of creating an iterator for the dataset and retrieving the batch of features, the tf.data.experimental.get_single_element() function is used to skip the iterator creation process and directly output the batch of features.
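As a minimal, self-contained sketch of the two retrieval styles, using a toy lambda in place of preprocessing_fn and made-up input values:

```python
import tensorflow as tf

BATCH_SIZE = 4
raw_features = tf.constant([1.0, 2.0, 3.0, 4.0])  # toy stand-in input batch

dataset = (tf.data.Dataset.from_tensor_slices(raw_features)
           .map(lambda x: x * 10.0)  # stand-in for preprocessing_fn
           .batch(BATCH_SIZE))       # one batch -> dataset has one element

# Iterator-based retrieval of the single batch:
batch_via_iterator = next(iter(dataset))

# Stateless "tensor-in tensor-out" retrieval, no iterator:
batch_direct = tf.data.experimental.get_single_element(dataset)
```

Both tensors hold the same batch; the second form is the one that composes cleanly into a serving function.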

This can be particularly useful when your tensor transformations are expressed as tf.data.Dataset operations, and you want to use those transformations while serving your model.


model = ... # A pre-built or custom model

class PreprocessingModel(tf.keras.Model):
  def __init__(self, model):
    super().__init__()
    self.model = model

  @tf.function(input_signature=[...])  # spec for the raw input `data`
  def serving_fn(self, data):
    ds = tf.data.Dataset.from_tensor_slices(data)
    ds = ds.map(preprocessing_fn, num_parallel_calls=BATCH_SIZE)
    ds = ds.batch(batch_size=BATCH_SIZE)
    return tf.argmax(
        self.model(tf.data.experimental.get_single_element(ds)), axis=-1)

preprocessing_model = PreprocessingModel(model)
your_exported_model_dir = ... # save the model to this path.
tf.saved_model.save(preprocessing_model, your_exported_model_dir,
              signatures={'serving_default': preprocessing_model.serving_fn})
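To make the pattern above concrete, here is a hedged end-to-end sketch. The one-unit Dense model, the reshape-only preprocessing_fn, the TensorSpec, the temporary export directory, and the toy input values are all illustrative assumptions, not part of the API:

```python
import tempfile
import tensorflow as tf

BATCH_SIZE = 4

# Trivial stand-in model: maps a length-1 feature vector to 3 class scores.
model = tf.keras.Sequential([tf.keras.layers.Dense(3, input_shape=(1,))])

def preprocessing_fn(raw_feature):
    # Hypothetical preprocessing: reshape each scalar into a length-1 vector.
    return tf.reshape(raw_feature, [1])

class PreprocessingModel(tf.keras.Model):
    def __init__(self, model):
        super().__init__()
        self.model = model

    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def serving_fn(self, data):
        ds = tf.data.Dataset.from_tensor_slices(data)
        ds = ds.map(preprocessing_fn, num_parallel_calls=BATCH_SIZE)
        ds = ds.batch(batch_size=BATCH_SIZE)
        return tf.argmax(
            self.model(tf.data.experimental.get_single_element(ds)), axis=-1)

preprocessing_model = PreprocessingModel(model)

# Calling the serving function directly: raw scalars in, class ids out.
predictions = preprocessing_model.serving_fn(
    tf.constant([0.5, 1.5, 2.5, 3.5]))

export_dir = tempfile.mkdtemp()  # stand-in for your_exported_model_dir
tf.saved_model.save(preprocessing_model, export_dir,
                    signatures={'serving_default': preprocessing_model.serving_fn})
```

Note that the raw input here has exactly BATCH_SIZE elements, so batching produces a single-element dataset, which is what get_single_element requires.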


In the case of estimators, you generally need to define a serving_input_fn, which requires the features to be processed by the model during inference.

def serving_input_fn():

  raw_feature_spec = ... # Spec for the raw_features
  input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
      raw_feature_spec, default_batch_size=None)
  serving_input_receiver = input_fn()
  raw_features = serving_input_receiver.features

  def preprocessing_fn(raw_feature):
    # ... the raw_feature is preprocessed as per the use-case
    return feature

  dataset = (tf.data.Dataset.from_tensor_slices(raw_features)
            .map(preprocessing_fn, num_parallel_calls=BATCH_SIZE)
            .batch(BATCH_SIZE))

  processed_features = tf.data.experimental.get_single_element(dataset)

  # Please note that the value of `BATCH_SIZE` should be equal to
  # the size of the leading dimension of `raw_features`. This ensures
  # that `dataset` has only one element, which is a pre-requisite for
  # using `tf.data.experimental.get_single_element(dataset)`.

  return tf.estimator.export.ServingInputReceiver(
      processed_features, serving_input_receiver.receiver_tensors)

estimator = ... # A pre-built or custom estimator
estimator.export_saved_model(your_exported_model_dir, serving_input_fn)

Args:
  dataset: A tf.data.Dataset object containing a single element.

Returns:
  A nested structure of tf.Tensor objects, corresponding to the single element of dataset.

Raises:
  TypeError: if dataset is not a tf.data.Dataset object.
  InvalidArgumentError: (at runtime) if dataset does not contain exactly one element.
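The runtime failure mode can be sketched as follows; the toy values are assumptions for illustration:

```python
import tensorflow as tf

# A dataset with exactly one element: get_single_element succeeds and
# returns that element (here, the shape-(1,) batch [7]).
ok = tf.data.Dataset.from_tensor_slices([7]).batch(1)
value = tf.data.experimental.get_single_element(ok)

# A dataset with two elements violates the single-element pre-condition,
# so an InvalidArgumentError is raised when the op actually runs.
bad = tf.data.Dataset.from_tensor_slices([1, 2])
try:
    tf.data.experimental.get_single_element(bad)
    raised = False
except tf.errors.InvalidArgumentError:
    raised = True
```

The TypeError case, by contrast, is raised eagerly when the argument is not a tf.data.Dataset at all.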