tf.keras.layers.DenseFeatures
A layer that produces a dense `Tensor` based on given `feature_columns`.
tf.keras.layers.DenseFeatures(
    feature_columns, trainable=True, name=None, **kwargs
)
Generally, a single example in training data is described with FeatureColumns. At the first layer of the model, this column-oriented data should be converted to a single `Tensor`.
This layer can be called multiple times with different features.
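Conceptually, the conversion amounts to concatenating each column's dense output into one flat vector per example. The following is a minimal plain-Python sketch of that idea (no TensorFlow required); the function and feature names are illustrative, not part of the actual API.

```python
# Hypothetical sketch of the conversion DenseFeatures performs:
# column-oriented features (a dict of per-feature dense values) become
# one flat dense vector per example by concatenating each column's output.
def to_dense_vector(features, column_order):
    """features: dict mapping column name -> list of floats for one example."""
    vector = []
    for name in column_order:
        vector.extend(features[name])
    return vector

example = {
    "price": [9.99],                  # a numeric column -> 1 value
    "keywords_embedded": [0.1] * 16,  # a 16-dim embedding column -> 16 values
}
dense = to_dense_vector(example, ["price", "keywords_embedded"])
assert len(dense) == 17  # 1 + 16: a single flat row per example
```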
This is the V1 version of this layer, which uses variable_scopes to create variables; this works well with PartitionedVariables. Variable scopes are deprecated in V2, so the V2 version uses name_scopes instead, but name_scopes currently lack support for partitioned variables. Use this version if you need partitioned variables.
Example:

price = numeric_column('price')
keywords_embedded = embedding_column(
    categorical_column_with_hash_bucket("keywords", 10000), dimension=16)
columns = [price, keywords_embedded, ...]
feature_layer = DenseFeatures(columns)

features = tf.io.parse_example(..., features=make_parse_example_spec(columns))
dense_tensor = feature_layer(features)
for units in [128, 64, 32]:
    dense_tensor = tf.compat.v1.keras.layers.Dense(
        units, activation='relu')(dense_tensor)
prediction = tf.compat.v1.keras.layers.Dense(1)(dense_tensor)
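To make the `keywords_embedded` column above less opaque: hashing a string into one of the 10,000 buckets and looking up a 16-dimensional embedding row can be sketched in plain Python. This is a conceptual stand-in only; `hash_bucket` and `embedding_lookup` are hypothetical helpers, the embedding values are placeholders, and TensorFlow's actual hashing (Fingerprint64) and trained embedding variables differ.

```python
# Plain-Python sketch of categorical_column_with_hash_bucket + embedding_column.
import zlib

NUM_BUCKETS = 10000   # mirrors the 10000 hash buckets in the example
EMBEDDING_DIM = 16    # mirrors dimension=16

def hash_bucket(value, num_buckets=NUM_BUCKETS):
    """Map a string feature value to a stable integer bucket id."""
    return zlib.crc32(value.encode("utf-8")) % num_buckets

def embedding_lookup(bucket_id, dim=EMBEDDING_DIM):
    """Return a dense vector for the bucket (placeholder values; in
    TensorFlow this would be a row of a trained embedding variable)."""
    return [float(bucket_id * dim + i) for i in range(dim)]

vec = embedding_lookup(hash_bucket("discount"))
assert len(vec) == EMBEDDING_DIM  # each keyword maps to a 16-dim dense vector
```

Note the deterministic hash: the same keyword always lands in the same bucket, so it always retrieves the same embedding row.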
| Args | |
|------|--|
| `feature_columns` | An iterable containing the FeatureColumns to use as inputs to your model. All items should be instances of classes derived from `DenseColumn`, such as `numeric_column`, `embedding_column`, `bucketized_column`, or `indicator_column`. If you have categorical features, you can wrap them with an `embedding_column` or `indicator_column`. |
| `trainable` | Boolean, whether the layer's variables will be updated via gradient descent during training. |
| `name` | Name to give to the DenseFeatures layer. |
| `**kwargs` | Keyword arguments to construct a layer. |
| Raises | |
|--------|--|
| `ValueError` | If an item in `feature_columns` is not a `DenseColumn`. |
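The check behind this `ValueError` can be sketched as follows. The class and function names here are illustrative stand-ins, not TensorFlow internals: a raw categorical column is rejected until it is wrapped into a dense one.

```python
# Hypothetical sketch of the validation that raises this ValueError.
class DenseColumn:
    """Stand-in for the DenseColumn base (numeric, embedding, indicator...)."""

class CategoricalColumn:
    """Stand-in for a raw categorical column, which is not dense."""

def check_feature_columns(feature_columns):
    """Raise ValueError for any item that is not a DenseColumn."""
    for column in feature_columns:
        if not isinstance(column, DenseColumn):
            raise ValueError(
                "Items of feature_columns must be a DenseColumn. "
                "Wrap categorical columns with embedding_column or "
                "indicator_column. Given: {!r}".format(column))

check_feature_columns([DenseColumn()])  # passes: all items are dense
```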
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2020-10-01 UTC.