Differentiate a circuit with respect to its inputs by linearly combining values obtained by evaluating the op using parameter values perturbed about their forward-pass values.

tfq.differentiators.LinearCombination(
    weights, perturbations
)
my_op = tfq.get_expectation_op()
weights = [5, 6, 7]
perturbations = [0, 0.5, 0.25]
linear_differentiator = tfq.differentiators.LinearCombination(
    weights, perturbations
)
# Get an expectation op, with this differentiator attached.
op = linear_differentiator.generate_differentiable_op(
    analytic_op=my_op
)
qubit = cirq.GridQubit(0, 0)
circuit = tfq.convert_to_tensor([
    cirq.Circuit(cirq.X(qubit) ** sympy.Symbol('alpha'))
])
psums = tfq.convert_to_tensor([[cirq.Z(qubit)]])
symbol_values_array = np.array([[0.123]], dtype=np.float32)
# Calculate tfq gradient.
symbol_values_tensor = tf.convert_to_tensor(symbol_values_array)
with tf.GradientTape() as g:
    g.watch(symbol_values_tensor)
    expectations = op(circuit, ['alpha'], symbol_values_tensor, psums)
# Gradient would be: 5 * f(x+0) + 6 * f(x+0.5) + 7 * f(x+0.25)
grads = g.gradient(expectations, symbol_values_tensor)
# Note: this gradient isn't correct in value, but showcases
# the principle of how gradients can be defined in a very flexible way.
print(grads)
tf.Tensor([[5.089467]], shape=(1, 1), dtype=float32)
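As the note above says, the weights [5, 6, 7] do not form a valid finite-difference rule. A minimal numpy sketch (no TFQ required) shows how properly chosen weights and perturbations recover the true derivative; here f is taken to be the analytic expectation ⟨Z⟩ = cos(π·alpha) for the circuit X(qubit)**alpha above, stated only for illustration:

```python
import numpy as np

# Analytic expectation <Z> for the circuit X(qubit)**alpha applied to |0>:
# f(alpha) = cos(pi * alpha). (Derived by hand, for illustration only.)
def f(alpha):
    return np.cos(np.pi * alpha)

# Central-difference rule expressed as a linear combination:
# weights [1/(2*eps), -1/(2*eps)] paired with perturbations [eps, -eps].
eps = 1e-3
weights = [1 / (2 * eps), -1 / (2 * eps)]
perturbations = [eps, -eps]

alpha = 0.123
approx_grad = sum(w * f(alpha + p) for w, p in zip(weights, perturbations))
exact_grad = -np.pi * np.sin(np.pi * alpha)
# approx_grad agrees with exact_grad to roughly 1e-5
```

With these coefficients, LinearCombination reduces to an ordinary central-difference differentiator; the [5, 6, 7] example merely demonstrates that arbitrary combinations are accepted.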
weights: Python list of real numbers representing linear combination coefficients for each perturbed function evaluation.
perturbations: Python list of real numbers representing perturbation values.
differentiate_analytic( programs, symbol_names, symbol_values, pauli_sums, forward_pass_vals, grad )
differentiate_sampled( programs, symbol_names, symbol_values, pauli_sums, num_samples, forward_pass_vals, grad )
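Conceptually, both methods compute the same linear combination: perturb each symbol value, re-evaluate the expectation, weight and sum the results, then chain with the upstream gradient. A hypothetical pure-numpy sketch of that pattern (not the TFQ implementation; function and argument names are illustrative):

```python
import numpy as np

# Illustrative sketch of the perturb-evaluate-combine pattern behind the
# differentiate_* methods. `f` stands in for the expectation op.
def linear_combination_grad(f, symbol_values, weights, perturbations, grad):
    # symbol_values: (batch, n_symbols); f returns (batch,) expectations;
    # grad: (batch,) upstream gradient from backprop.
    grads = np.zeros_like(symbol_values)
    for s in range(symbol_values.shape[1]):
        for w, p in zip(weights, perturbations):
            perturbed = symbol_values.copy()
            perturbed[:, s] += p  # perturb one symbol at a time
            grads[:, s] += w * f(perturbed)
    return grads * grad[:, None]  # chain rule with upstream gradient

# Toy check with f(x, y) = x * y and central-difference coefficients.
eps = 1e-3
f = lambda v: v[:, 0] * v[:, 1]
vals = np.array([[2.0, 3.0]])
g = linear_combination_grad(
    f, vals, [1 / (2 * eps), -1 / (2 * eps)], [eps, -eps], np.ones(1))
# g is approximately [[3.0, 2.0]], i.e. (df/dx, df/dy) at (2, 3)
```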
Generate a differentiable op by attaching self to an op.
This function returns a tf.function that passes values through to forward_op during the forward pass and invokes this differentiator to backpropagate through the op during the backward pass. If sampled_op is provided, the differentiator's differentiate_sampled method will be invoked (which requires sampled_op to be a sample-based expectation op with a num_samples input tensor). If analytic_op is provided, the differentiate_analytic method will be invoked (which requires analytic_op to be an analytic expectation op that does NOT have num_samples as an input). If both sampled_op and analytic_op are provided, an exception will be raised.
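The forward/backward split can be sketched outside TensorFlow as pairing a forward function with a gradient rule built from the weights and perturbations. This is a hypothetical illustration only; the real op is a tf.function whose gradients are registered with TensorFlow's autodiff machinery:

```python
import numpy as np

# Hypothetical sketch of 'attaching' a differentiator to an op: the
# returned callable uses forward_op for values, and its gradient rule
# is the linear combination sum_k w_k * forward_op(x + p_k).
def make_differentiable_op(forward_op, weights, perturbations):
    def op(x):
        value = forward_op(x)
        grad = sum(w * forward_op(x + p)
                   for w, p in zip(weights, perturbations))
        return value, grad
    return op

# Central-difference coefficients make the gradient rule exact-ish.
eps = 1e-4
op = make_differentiable_op(np.sin, [1 / (2 * eps), -1 / (2 * eps)],
                            [eps, -eps])
value, grad = op(0.0)
# value is 0.0; grad approximates cos(0.0) == 1.0
```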
Note: generate_differentiable_op() can be called only ONCE per differentiator instance, because of the one-differentiator-per-op policy. You need to call refresh() to reuse this differentiator with another op.
sampled_op: a callable op that you want to make differentiable using this differentiator's differentiate_sampled method.
analytic_op: a callable op that you want to make differentiable using this differentiator's differentiate_analytic method.
Returns: a callable op whose gradients are now registered to be a call to this differentiator's differentiate_* function.
Refresh this differentiator in order to use it with other ops.