tfc.layers.GDNParameter

Nonnegative parameterization as needed for GDN parameters.

Inherits From: Parameter

The variable is subjected to an invertible transformation that slows down the learning rate for small values.

Args

initial_value tf.Tensor or None. The initial value of the kernel. If not provided, the shape argument must be given, and the initial value of the parameter will be undefined.
name String. The name of the parameter.
minimum Float. Lower bound for the parameter (defaults to zero).
offset Float. Offset added to the reparameterization. Parameterizing beta/gamma as their square roots slows down training when values are close to zero, which is desirable: with small values in the denominator, gradient noise on beta/gamma can otherwise cause extreme amounts of noise in the GDN activations. However, without the offset, the gradient would be zero if any element of beta or gamma were exactly zero, and training could get stuck. To prevent this, a small constant is added. The default value was empirically determined to be a good starting point. Making it bigger potentially leads to more gradient noise on the activations; making it too small may lead to numerical precision issues.
shape tf.TensorShape or compatible. Ignored unless initial_value is None.
dtype tf.dtypes.DType or compatible. DType of this parameter. If not given, inferred from initial_value.
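The square-root reparameterization described under offset can be sketched in plain Python. This is an illustration of the idea only, not the library's implementation; the helper names and the offset value used here are assumptions, and the real class operates on tf.Variable/tf.Tensor objects.

```python
import math

OFFSET = 2 ** -18  # illustrative small constant; see the class signature for the real default

def reparam(value, minimum=0.0, offset=OFFSET):
    # The stored variable holds the square root of the (shifted) value.
    return math.sqrt(max(value, minimum) + offset ** 2)

def value(variable, minimum=0.0, offset=OFFSET):
    # Invert the transformation; max() enforces the lower bound.
    return max(variable ** 2 - offset ** 2, minimum)

# The round trip is exact up to floating-point error.
beta = 0.25
assert abs(value(reparam(beta)) - beta) < 1e-12

# The gradient of the value w.r.t. the stored variable is 2 * variable,
# which shrinks as the value approaches zero: small values effectively
# see a smaller learning rate, as the class description states.
grad_small = 2 * reparam(1e-6)
grad_large = 2 * reparam(1.0)
assert grad_small < grad_large
```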

Attributes

minimum Float. The minimum parameter provided on initialization.
offset Float. The offset parameter provided on initialization.
variable tf.Variable. The reparameterized variable.
name Returns the name of this module as passed or determined in the constructor.

name_scope Returns a tf.name_scope instance for this class.
non_trainable_variables Sequence of non-trainable variables owned by this module and its submodules.
submodules Sequence of all sub-modules.

Submodules are modules which are properties of this module, or found as properties of modules which are properties of this module (and so on).

>>> a = tf.Module()
>>> b = tf.Module()
>>> c = tf.Module()
>>> a.b = b
>>> b.c = c
>>> list(a.submodules) == [b, c]
True
>>> list(b.submodules) == [c]
True
>>> list(c.submodules) == []
True

trainable_variables Sequence of trainable variables owned by this module and its submodules.

variables Sequence of variables owned by this module and its submodules.

Methods

get_config


Returns the configuration of the Parameter.

get_weights


set_weights


with_name_scope

Decorator to automatically enter the module name scope.

>>> class MyModule(tf.Module):
...   @tf.Module.with_name_scope
...   def __call__(self, x):
...     if not hasattr(self, 'w'):
...       self.w = tf.Variable(tf.random.normal([x.shape[1], 3]))
...     return tf.matmul(x, self.w)

Using the above module would produce tf.Variables and tf.Tensors whose names include the module name:

>>> mod = MyModule()
>>> mod(tf.ones([1, 2]))
<tf.Tensor: shape=(1, 3), dtype=float32, numpy=...>
>>> mod.w
<tf.Variable 'my_module/Variable:0' shape=(2, 3) dtype=float32, numpy=...>

Args
method The method to wrap.

Returns
The original method wrapped such that it enters the module's name scope.

__call__


Computes and returns the non-negative value as a tf.Tensor.
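The relationship between the variable attribute and __call__ can be summed up in a plain-Python stand-in. This is a hypothetical sketch for illustration: the class name is invented, the offset default is assumed, and the real class stores a trainable tf.Variable and returns a tf.Tensor.

```python
class GDNParameterSketch:
    """Plain-Python stand-in for tfc.layers.GDNParameter (illustrative only)."""

    def __init__(self, initial_value, minimum=0.0, offset=2 ** -18):
        self.minimum = minimum
        self.offset = offset
        # The reparameterized variable stores sqrt(value + offset**2);
        # in the real class this is the trainable tf.Variable.
        self.variable = (max(initial_value, minimum) + offset ** 2) ** 0.5

    def __call__(self):
        # Computes and returns the non-negative value, clipped at `minimum`.
        return max(self.variable ** 2 - self.offset ** 2, self.minimum)

p = GDNParameterSketch(0.1)
print(p())  # close to 0.1, and never below `minimum`
```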