Differentiate a circuit with respect to its inputs by linearly combining values obtained by evaluating the op using parameter values perturbed about their forward-pass values.
Inherits From: Differentiator
tfq.differentiators.LinearCombination(
    weights, perturbations
)
my_op = tfq.get_expectation_op()
weights = [5, 6, 7]
perturbations = [0, 0.5, 0.25]
linear_differentiator = tfq.differentiators.LinearCombination(
    weights, perturbations
)
# Get an expectation op, with this differentiator attached.
op = linear_differentiator.generate_differentiable_op(analytic_op=my_op)
qubit = cirq.GridQubit(0, 0)
circuit = tfq.convert_to_tensor([
    cirq.Circuit(cirq.X(qubit) ** sympy.Symbol('alpha'))
])
psums = tfq.convert_to_tensor([[cirq.Z(qubit)]])
symbol_values = np.array([[0.123]], dtype=np.float32)
# Calculate tfq gradient.
symbol_values_t = tf.convert_to_tensor(symbol_values)
symbol_names = tf.convert_to_tensor(['alpha'])
with tf.GradientTape() as g:
    g.watch(symbol_values_t)
    expectations = op(circuit, symbol_names, symbol_values_t, psums)
# Gradient would be: 5 * f(x+0) + 6 * f(x+0.5) + 7 * f(x+0.25)
grads = g.gradient(expectations, symbol_values_t)
# Note: this gradient isn't correct in value, but showcases
# the principle of how gradients can be defined in a very flexible
# fashion.
grads
# tf.Tensor([[5.089467]], shape=(1, 1), dtype=float32)
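The weights and perturbations passed to the constructor fully determine the differentiation rule, so standard finite-difference schemes are special cases of LinearCombination. The following is a minimal sketch (not part of the example above); the step size h is an arbitrary illustrative assumption.

import tensorflow_quantum as tfq

# A symmetric finite-difference rule,
#   df/dx ~= (f(x + h) - f(x - h)) / (2 * h),
# expressed as a LinearCombination. The step size h is chosen
# arbitrarily for illustration.
h = 0.01
central_diff = tfq.differentiators.LinearCombination(
    weights=[1 / (2 * h), -1 / (2 * h)],
    perturbations=[h, -h]
)
finite_diff_op = central_diff.generate_differentiable_op(
    analytic_op=tfq.get_expectation_op()
)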
Methods
differentiate_analytic
@tf.function
differentiate_analytic(
    programs, symbol_names, symbol_values, pauli_sums, forward_pass_vals, grad
)
Differentiate a circuit with analytical expectation.
This is called at graph runtime by TensorFlow. differentiate_analytic
calls the inheriting differentiator's get_gradient_circuits and uses
those components to construct the gradient.
| Args | |
|---|---|
| programs | tf.Tensor of strings with shape [batch_size] containing the string representations of the circuits to be executed. |
| symbol_names | tf.Tensor of strings with shape [n_params], which is used to specify the order in which the values in symbol_values should be placed inside of the circuits in programs. |
| symbol_values | tf.Tensor of real numbers with shape [batch_size, n_params] specifying parameter values to resolve into the circuits specified by programs, following the ordering dictated by symbol_names. |
| pauli_sums | tf.Tensor of strings with shape [batch_size, n_ops] containing the string representation of the operators that will be used on all of the circuits in the expectation calculations. |
| forward_pass_vals | tf.Tensor of real numbers with shape [batch_size, n_ops] containing the output of the forward pass through the op you are differentiating. |
| grad | tf.Tensor of real numbers with shape [batch_size, n_ops] representing the gradient backpropagated to the output of the op you are differentiating through. |
| Returns | |
|---|---|
| A tf.Tensor with the same shape as symbol_values representing the gradient backpropagated to the symbol_values input of the op you are differentiating through. |
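differentiate_analytic is normally invoked by TensorFlow at graph runtime rather than called by hand. As a rough illustration of the rule it implements, the weighted sum from the example at the top of this page can be reproduced directly. This is a sketch only: it reuses my_op, circuit, symbol_names, psums, weights, perturbations, and symbol_values from that example, and it ignores the incoming grad factor that the real method folds in.

import tensorflow as tf

# Sketch: evaluate the analytic op at each perturbed symbol value and
# combine the results with the corresponding weights, mirroring
# 5 * f(x+0) + 6 * f(x+0.5) + 7 * f(x+0.25) from the example above.
manual_combination = sum(
    w * my_op(
        circuit,
        symbol_names,
        tf.convert_to_tensor(symbol_values + p, dtype=tf.float32),
        psums)
    for w, p in zip(weights, perturbations))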
differentiate_sampled
@tf.function
differentiate_sampled(
    programs, symbol_names, symbol_values, pauli_sums, num_samples, forward_pass_vals, grad
)
Differentiate a circuit with sampled expectation.
This is called at graph runtime by TensorFlow. differentiate_sampled
calls the inheriting differentiator's get_gradient_circuits and uses
those components to construct the gradient.
| Args | |
|---|---|
| programs | tf.Tensor of strings with shape [batch_size] containing the string representations of the circuits to be executed. |
| symbol_names | tf.Tensor of strings with shape [n_params], which is used to specify the order in which the values in symbol_values should be placed inside of the circuits in programs. |
| symbol_values | tf.Tensor of real numbers with shape [batch_size, n_params] specifying parameter values to resolve into the circuits specified by programs, following the ordering dictated by symbol_names. |
| pauli_sums | tf.Tensor of strings with shape [batch_size, n_ops] containing the string representation of the operators that will be used on all of the circuits in the expectation calculations. |
| num_samples | tf.Tensor of positive integers representing the number of samples per term in each term of pauli_sums used during the forward pass. |
| forward_pass_vals | tf.Tensor of real numbers with shape [batch_size, n_ops] containing the output of the forward pass through the op you are differentiating. |
| grad | tf.Tensor of real numbers with shape [batch_size, n_ops] representing the gradient backpropagated to the output of the op you are differentiating through. |
| Returns | |
|---|---|
| A tf.Tensor with the same shape as symbol_values representing the gradient backpropagated to the symbol_values input of the op you are differentiating through. |
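The sampled path works the same way once a sample-based expectation op is attached. Below is a minimal sketch, assuming tfq.get_sampled_expectation_op() as the forward op and reusing circuit, symbol_names, symbol_values_t, psums, weights, and perturbations from the example at the top of this page; the num_samples layout of one entry per operator in pauli_sums is an assumption here.

import tensorflow as tf
import tensorflow_quantum as tfq

# Sketch: attach the same kind of differentiator to a sampled
# expectation op. A fresh differentiator is used because each
# differentiator can only be attached to one op at a time.
sampled_diff = tfq.differentiators.LinearCombination(weights, perturbations)
sampled_op = sampled_diff.generate_differentiable_op(
    sampled_op=tfq.get_sampled_expectation_op()
)
num_samples = tf.convert_to_tensor([[1000]])  # one count per pauli_sum term
with tf.GradientTape() as g:
    g.watch(symbol_values_t)
    expectations = sampled_op(
        circuit, symbol_names, symbol_values_t, psums, num_samples)
sampled_grads = g.gradient(expectations, symbol_values_t)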
generate_differentiable_op
generate_differentiable_op(
    *, sampled_op=None, analytic_op=None
)
Generate a differentiable op by attaching self to an op.
This function returns a tf.function that passes values through to
forward_op during the forward pass and uses this differentiator (self) to
backpropagate through the op during the backward pass. If sampled_op
is provided, the differentiator's differentiate_sampled method will
be invoked (which requires sampled_op to be a sample-based expectation
op with a num_samples input tensor). If analytic_op is provided, the
differentiator's differentiate_analytic method will be invoked (which
requires analytic_op to be an analytic expectation op that does
NOT have num_samples as an input). If both sampled_op and analytic_op
are provided, an exception will be raised.
This generate_differentiable_op() can only be called ONCE because
of the one-differentiator-per-op policy. You need to call refresh()
to reuse this differentiator with another op.
| Args | |
|---|---|
| sampled_op | A callable op that you want to make differentiable using this differentiator's differentiate_sampled method. |
| analytic_op | A callable op that you want to make differentiable using this differentiator's differentiate_analytic method. |
| Returns | |
|---|---|
| A callable op whose gradients are now registered to be a call to this differentiator's differentiate_* function. |
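A short sketch of the constraints described above; the weights and perturbations here are arbitrary illustrative values:

import tensorflow_quantum as tfq

diff = tfq.differentiators.LinearCombination([1, -1], [0.5, -0.5])

# Valid: exactly one of analytic_op / sampled_op is given, by keyword.
op = diff.generate_differentiable_op(analytic_op=tfq.get_expectation_op())

# Passing both analytic_op and sampled_op, or calling
# generate_differentiable_op on `diff` a second time without an
# intervening diff.refresh(), raises an error.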
get_gradient_circuits
@tf.function
get_gradient_circuits(
    programs, symbol_names, symbol_values
)
See base class description.
refresh
refresh()
Refresh this differentiator in order to use it with other ops.
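A minimal sketch of the intended workflow, reusing linear_differentiator from the example at the top of this page (which is already attached to an analytic op there):

import tensorflow_quantum as tfq

# refresh() detaches the differentiator so it can be attached again,
# here to a sampled expectation op instead of the analytic one.
linear_differentiator.refresh()
new_op = linear_differentiator.generate_differentiable_op(
    sampled_op=tfq.get_sampled_expectation_op()
)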