tfr.keras.layers.create_tower
Creates a feed-forward network as tf.keras.Sequential.
tfr.keras.layers.create_tower(
    hidden_layer_dims: List[int],
    output_units: int,
    activation: Optional[Callable[..., tf.Tensor]] = None,
    input_batch_norm: bool = False,
    use_batch_norm: bool = True,
    batch_norm_moment: float = 0.999,
    dropout: float = 0.5,
    name: Optional[str] = None,
    **kwargs
)
It creates a feed-forward network with batch normalization and dropout, and
optionally applies batch normalization on inputs.
Example usage:
tower = create_tower(hidden_layer_dims=[64, 32, 16], output_units=1)
inputs = tf.ones([2, 3, 1])
tower_logits = tower(inputs)
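For intuition about the shapes involved, the dense stack that create_tower assembles can be sketched in plain NumPy. This is an illustrative sketch only (function and variable names here are not part of the TF-Ranking API): batch normalization and dropout are omitted, since with fixed statistics they are identity-like at inference time, and the weights are random, so only the shapes are meaningful.

```python
import numpy as np

def tower_sketch(x, hidden_layer_dims, output_units, activation=None, seed=0):
    """Rough NumPy sketch of the dense stack create_tower assembles.

    Batch normalization and dropout are omitted; weights are random,
    so this only illustrates the flow of shapes, not learned behavior.
    """
    rng = np.random.default_rng(seed)
    act = activation or (lambda t: t)  # None -> identity activation
    # One fully connected layer per entry in hidden_layer_dims.
    for dim in hidden_layer_dims:
        w = rng.standard_normal((x.shape[-1], dim))
        x = act(x @ w)
    # Final projection down to the tower's output logits.
    w_out = rng.standard_normal((x.shape[-1], output_units))
    return x @ w_out

logits = tower_sketch(np.ones((2, 3, 1)), [64, 32, 16], output_units=1)
print(logits.shape)  # (2, 3, 1): leading dims preserved, last dim = output_units
```

As in the real layer, only the last axis changes: an input of shape [2, 3, 1] passes through hidden widths 64, 32, and 16 before the final projection to output_units logits.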
Args

hidden_layer_dims: Iterable of the number of hidden units per layer. All layers are fully connected. Ex. [64, 32] means the first layer has 64 nodes and the second has 32.
output_units: Size of the output logits from this tower.
activation: Activation function applied to each layer. If None, an identity activation is used.
input_batch_norm: Whether to apply batch normalization to the input layer.
use_batch_norm: Whether to apply batch normalization after each hidden layer.
batch_norm_moment: Momentum for the moving average in batch normalization.
dropout: When not None, the probability that a given coordinate is dropped out.
name: Name of the Keras layer.
**kwargs: Keyword arguments passed to every tf.keras.layers.Dense layer.
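The batch_norm_moment argument corresponds to the momentum of Keras BatchNormalization, which maintains exponential moving averages of the batch statistics. A minimal sketch of that update rule (the helper name is illustrative, not a TF-Ranking API):

```python
import numpy as np

def update_moving_mean(moving_mean, batch_mean, momentum=0.999):
    # Keras-style exponential moving average:
    #   moving = moving * momentum + batch * (1 - momentum)
    return moving_mean * momentum + batch_mean * (1.0 - momentum)

m = 0.0
for batch_mean in [1.0, 1.0, 1.0]:  # three training steps, batch mean of 1.0
    m = update_moving_mean(m, batch_mean)
print(m)  # after n steps toward 1.0: 1 - momentum**n
```

A momentum close to 1.0 (the default 0.999) makes the moving statistics change slowly, which smooths out noise from individual batches.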
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2023-08-18 UTC.