DP subclass of `tf.keras.Sequential`.

```python
tf_privacy.DPSequential(
    l2_norm_clip, noise_multiplier, use_xla=True, *args, **kwargs
)
```
This can be used as a differentially private replacement for `tf.keras.Sequential`. This class implements DP-SGD using the standard Gaussian mechanism.

When instantiating this class, you need to supply several DP-related arguments followed by the standard arguments for `Sequential`.
Examples:

```python
# Create Model instance.
model = DPSequential(l2_norm_clip=1.0, noise_multiplier=0.5, use_xla=True,
                     <standard arguments>)
```
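For concreteness, here is one way the placeholder above might be filled in. This is a minimal sketch: the layer stack, layer sizes, and input shape are illustrative assumptions, not part of the API, and it assumes `tensorflow_privacy` exports `DPSequential` under the `tf_privacy` alias used on this page.

```python
import tensorflow as tf
import tensorflow_privacy as tf_privacy

# Illustrative layer stack; `layers` is forwarded to the
# tf.keras.Sequential base class via **kwargs.
model = tf_privacy.DPSequential(
    l2_norm_clip=1.0,
    noise_multiplier=0.5,
    use_xla=True,
    layers=[
        tf.keras.layers.InputLayer(input_shape=(20,)),
        tf.keras.layers.Dense(16, activation='relu'),
        tf.keras.layers.Dense(1),
    ])
```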
You should use your `DPSequential` instance with a standard instance of `tf.keras.Optimizer` as the optimizer, and a standard reduced loss. You do not need to use a differentially private optimizer.
```python
# Use a standard (non-DP) optimizer.
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

# Use a standard reduced loss.
loss = tf.keras.losses.MeanSquaredError()

model.compile(optimizer=optimizer, loss=loss)
model.fit(train_data, train_labels, epochs=1, batch_size=32)
```
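Putting the pieces together, here is a minimal end-to-end sketch. The synthetic data, model shape, and hyperparameters are illustrative assumptions only:

```python
import numpy as np
import tensorflow as tf
import tensorflow_privacy as tf_privacy

# Synthetic regression data, purely for illustration.
train_data = np.random.normal(size=(256, 20)).astype(np.float32)
train_labels = np.random.normal(size=(256, 1)).astype(np.float32)

model = tf_privacy.DPSequential(
    l2_norm_clip=1.0,
    noise_multiplier=0.5,
    layers=[
        tf.keras.layers.InputLayer(input_shape=(20,)),
        tf.keras.layers.Dense(1),
    ])

# Standard (non-DP) optimizer and reduced loss, as described above.
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
loss = tf.keras.losses.MeanSquaredError()

model.compile(optimizer=optimizer, loss=loss)
model.fit(train_data, train_labels, epochs=1, batch_size=32)
```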
| Args | |
|---|---|
| `l2_norm_clip` | Clipping norm (max L2 norm of per-microbatch gradients). |
| `noise_multiplier` | Ratio of the standard deviation to the clipping norm. |
| `use_xla` | If `True`, compiles `train_step` to XLA. |
| `*args` | These will be passed on to the base class `__init__` method. |
| `**kwargs` | These will be passed on to the base class `__init__` method. |
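As a quick sanity check on how the first two arguments interact: in the standard Gaussian mechanism of DP-SGD, the standard deviation of the noise added to the clipped gradients is the product of the two (this follows from `noise_multiplier` being defined as a ratio; it is a property of DP-SGD generally, not a claim about this class's internals):

```python
l2_norm_clip = 1.0
noise_multiplier = 0.5

# noise_multiplier = stddev / l2_norm_clip, so:
noise_stddev = noise_multiplier * l2_norm_clip  # 0.5
```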
## Methods
### add

```python
add(
    layer
)
```

Adds a layer instance on top of the layer stack.
| Args | |
|---|---|
| `layer` | Layer instance. |
| Raises | |
|---|---|
| `TypeError` | If `layer` is not a layer instance. |
| `ValueError` | In case the `layer` argument does not know its input shape. |
| `ValueError` | In case the `layer` argument has multiple output tensors, or is already connected somewhere else (forbidden in `Sequential` models). |
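A short usage sketch for `add`; the model and layers are illustrative, and the behavior is inherited from `tf.keras.Sequential`:

```python
import tensorflow as tf
import tensorflow_privacy as tf_privacy

model = tf_privacy.DPSequential(
    l2_norm_clip=1.0,
    noise_multiplier=0.5,
    layers=[tf.keras.layers.InputLayer(input_shape=(20,))])

# Grows the stack one layer at a time, just as with Sequential.
model.add(tf.keras.layers.Dense(16, activation='relu'))
model.add(tf.keras.layers.Dense(1))
```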
### pop

```python
pop()
```

Removes the last layer in the model.
| Raises | |
|---|---|
| `TypeError` | If there are no layers in the model. |
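Continuing the `add` sketch above, `pop` undoes the most recent addition (illustrative only):

```python
model.add(tf.keras.layers.Dense(4))  # stack now ends with Dense(4)
model.pop()                          # removes that Dense(4) layer
print(len(model.layers))             # back to the previous depth
```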