tfr.keras.layers.create_tower
Creates a feed-forward network as `tf.keras.Sequential`.
tfr.keras.layers.create_tower(
    hidden_layer_dims: List[int],
    output_units: int,
    activation: Optional[Callable[..., tf.Tensor]] = None,
    input_batch_norm: bool = False,
    use_batch_norm: bool = True,
    batch_norm_moment: float = 0.999,
    dropout: float = 0.5,
    name: Optional[str] = None,
    **kwargs
)
It creates a feed-forward network with batch normalization and dropout after
each hidden layer, and optionally applies batch normalization to the inputs.
Example usage:

import tensorflow as tf
import tensorflow_ranking as tfr

tower = tfr.keras.layers.create_tower(hidden_layer_dims=[64, 32, 16], output_units=1)
inputs = tf.ones([2, 3, 1])
tower_logits = tower(inputs)  # shape: [2, 3, 1]
Args

| Argument | Description |
|---------------------|-------------|
| `hidden_layer_dims` | Iterable of the number of hidden units per layer. All layers are fully connected. For example, `[64, 32]` means the first layer has 64 nodes and the second has 32. |
| `output_units` | Size of the output logits from this tower. |
| `activation` | Activation function applied to each layer. If `None`, an identity activation is used. |
| `input_batch_norm` | Whether to apply batch normalization to the input layer. |
| `use_batch_norm` | Whether to apply batch normalization after each hidden layer. |
| `batch_norm_moment` | Momentum for the moving average in batch normalization. |
| `dropout` | When not `None`, the probability that a given coordinate is dropped out. |
| `name` | Name of the Keras layer. |
| `**kwargs` | Keyword arguments passed to every `tf.keras.layers.Dense` layer. |

Returns

| A `tf.keras.Sequential` object. |
|---------------------------------|
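To make the argument semantics concrete, the tower's structure can be sketched with plain `tf.keras` layers. This is a minimal, hypothetical re-implementation of what `create_tower` assembles under the documented defaults, not the library's actual code; `build_tower_sketch` is a name introduced here for illustration.

```python
import tensorflow as tf


def build_tower_sketch(hidden_layer_dims, output_units,
                       activation=None, input_batch_norm=False,
                       use_batch_norm=True, batch_norm_moment=0.999,
                       dropout=0.5):
    """Hypothetical sketch of the Sequential that create_tower builds."""
    model = tf.keras.Sequential()
    if input_batch_norm:
        # Optional batch normalization applied directly to the inputs.
        model.add(tf.keras.layers.BatchNormalization(momentum=batch_norm_moment))
    for dims in hidden_layer_dims:
        model.add(tf.keras.layers.Dense(units=dims))
        if use_batch_norm:
            model.add(tf.keras.layers.BatchNormalization(momentum=batch_norm_moment))
        # Activation(None) resolves to the identity (linear) activation.
        model.add(tf.keras.layers.Activation(activation))
        if dropout:
            model.add(tf.keras.layers.Dropout(rate=dropout))
    # Final projection to the output logits; no activation or dropout after it.
    model.add(tf.keras.layers.Dense(units=output_units))
    return model


tower = build_tower_sketch([64, 32, 16], output_units=1)
logits = tower(tf.ones([2, 3, 1]))  # Dense acts on the last axis
```

Because `Dense` operates on the last axis, a `[batch, list_size, features]` input yields `[batch, list_size, output_units]` logits, which is why the 3-D example input above works without flattening.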
Last updated 2023-08-18 UTC.