
Interface that defines how to specify gradients for a quantum circuit.

This abstract class allows for the creation of gradient calculation procedures for (expectation values from) quantum circuits, with respect to a set of input parameter values. This allows one to backpropagate through a quantum circuit.
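For orientation, here is a minimal sketch (assuming TensorFlow Quantum and Cirq are installed; the concrete `ForwardDifference` differentiator and the single-qubit circuit are illustrative choices, not requirements) of how a differentiator attaches to an expectation op so that `tf.GradientTape` can backpropagate through a circuit:

```
import cirq
import sympy
import tensorflow as tf
import tensorflow_quantum as tfq

qubit = cirq.GridQubit(0, 0)
theta = sympy.Symbol('theta')
circuit = cirq.Circuit(cirq.rx(theta)(qubit))

# Attach a concrete differentiator to an analytic expectation op.
diff = tfq.differentiators.ForwardDifference()
expectation_op = diff.generate_differentiable_op(
    analytic_op=tfq.get_expectation_op())

programs = tfq.convert_to_tensor([circuit])
symbol_names = tf.constant(['theta'])
symbol_values = tf.Variable([[0.5]])
pauli_sums = tfq.convert_to_tensor([[cirq.Z(qubit)]])

with tf.GradientTape() as tape:
    expectations = expectation_op(
        programs, symbol_names, symbol_values, pauli_sums)
# Gradient of the expectation values w.r.t. the circuit parameters.
grads = tape.gradient(expectations, symbol_values)
```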

## Methods

### `differentiate_analytic`

```
differentiate_analytic(
    programs, symbol_names, symbol_values, pauli_sums, forward_pass_vals,
    grad
)
```

Specify how to differentiate a circuit with analytical expectation.

This is called at graph runtime by TensorFlow. `differentiate_analytic` should calculate the gradient of a batch of circuits and return it formatted as indicated below. See `tfq.differentiators.ForwardDifference` for an example.

#### Args:

* `programs`: `tf.Tensor` of strings with shape [batch_size] containing the string representations of the circuits to be executed.
* `symbol_names`: `tf.Tensor` of strings with shape [n_params], which is used to specify the order in which the values in `symbol_values` should be placed inside of the circuits in `programs`.
* `symbol_values`: `tf.Tensor` of real numbers with shape [batch_size, n_params] specifying parameter values to resolve into the circuits specified by `programs`, following the ordering dictated by `symbol_names`.
* `pauli_sums`: `tf.Tensor` of strings with shape [batch_size, n_ops] containing the string representation of the operators that will be used on all of the circuits in the expectation calculations.
* `forward_pass_vals`: `tf.Tensor` of real numbers with shape [batch_size, n_ops] containing the output of the forward pass through the op you are differentiating.
* `grad`: `tf.Tensor` of real numbers with shape [batch_size, n_ops] representing the gradient backpropagated to the output of the op you are differentiating through.

#### Returns:

A `tf.Tensor` with the same shape as `symbol_values` representing the gradient backpropagated to the `symbol_values` input of the op you are differentiating through.
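To make the contract concrete, here is a hedged sketch of a custom subclass that estimates this gradient with central finite differences. The class name, the stored `self._op` reference, and the step size `eps` are illustrative assumptions, not part of the TFQ API, and depending on your TFQ version the base class may require additional abstract methods:

```
import tensorflow as tf
import tensorflow_quantum as tfq

class CentralDifference(tfq.differentiators.Differentiator):
    """Illustrative finite-difference differentiator (not part of TFQ)."""

    def __init__(self, analytic_op, eps=1e-2):
        # Assumption: hold our own reference to an analytic expectation op
        # so the gradient can be estimated by re-evaluating it.
        self._op = analytic_op
        self._eps = eps

    def differentiate_analytic(self, programs, symbol_names, symbol_values,
                               pauli_sums, forward_pass_vals, grad):
        n_params = symbol_values.shape[1]  # assumes static param count
        partials = []
        for i in range(n_params):
            # Perturb only parameter i, broadcast over the batch.
            shift = tf.one_hot(i, n_params, on_value=self._eps,
                               off_value=0.0, dtype=symbol_values.dtype)
            plus = self._op(programs, symbol_names,
                            symbol_values + shift, pauli_sums)
            minus = self._op(programs, symbol_names,
                             symbol_values - shift, pauli_sums)
            partials.append((plus - minus) / (2.0 * self._eps))
        # Stack into a [batch_size, n_params, n_ops] Jacobian, then
        # contract with the incoming gradient -> [batch_size, n_params].
        jacobian = tf.stack(partials, axis=1)
        return tf.einsum('bpo,bo->bp', jacobian, grad)

    def differentiate_sampled(self, programs, symbol_names, symbol_values,
                              pauli_sums, num_samples, forward_pass_vals,
                              grad):
        # Same recipe, but every op call would also pass num_samples.
        raise NotImplementedError('Sketch covers the analytic case only.')
```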

### `differentiate_sampled`

```
differentiate_sampled(
    programs, symbol_names, symbol_values, pauli_sums, num_samples,
    forward_pass_vals, grad
)
```

Specify how to differentiate a circuit with sampled expectation.

This is called at graph runtime by TensorFlow. `differentiate_sampled` should calculate the gradient of a batch of circuits and return it formatted as indicated below. See `tfq.differentiators.ForwardDifference` for an example.

#### Args:

* `programs`: `tf.Tensor` of strings with shape [batch_size] containing the string representations of the circuits to be executed.
* `symbol_names`: `tf.Tensor` of strings with shape [n_params], which is used to specify the order in which the values in `symbol_values` should be placed inside of the circuits in `programs`.
* `symbol_values`: `tf.Tensor` of real numbers with shape [batch_size, n_params] specifying parameter values to resolve into the circuits specified by `programs`, following the ordering dictated by `symbol_names`.
* `pauli_sums`: `tf.Tensor` of strings with shape [batch_size, n_ops] containing the string representation of the operators that will be used on all of the circuits in the expectation calculations.
* `num_samples`: `tf.Tensor` of positive integers representing the number of samples to draw per term of `pauli_sums` during the forward pass.
* `forward_pass_vals`: `tf.Tensor` of real numbers with shape [batch_size, n_ops] containing the output of the forward pass through the op you are differentiating.
* `grad`: `tf.Tensor` of real numbers with shape [batch_size, n_ops] representing the gradient backpropagated to the output of the op you are differentiating through.

#### Returns:

A `tf.Tensor` with the same shape as `symbol_values` representing the gradient backpropagated to the `symbol_values` input of the op you are differentiating through.
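By contrast with the analytic case, a sampled expectation op carries the extra `num_samples` tensor on every call. A minimal sketch, assuming TFQ is installed (the shot count of 500 is an arbitrary illustrative choice):

```
import tensorflow as tf
import tensorflow_quantum as tfq

diff = tfq.differentiators.ForwardDifference()
sampled_expectation_op = diff.generate_differentiable_op(
    sampled_op=tfq.get_sampled_expectation_op())

# num_samples has shape [batch_size, n_ops]; here 500 measurement shots
# for a single operator on a single circuit.
num_samples = tf.constant([[500]])
```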

### `generate_differentiable_op`

```
generate_differentiable_op(
    *, sampled_op=None, analytic_op=None
)
```

Generate a differentiable op by attaching self to an op.

This function returns a `tf.function` that passes values through to `forward_op` during the forward pass and this differentiator (`self`) to backpropagate through the op during the backward pass. If `sampled_op` is provided, the differentiator's `differentiate_sampled` method will be invoked (which requires `sampled_op` to be a sample-based expectation op with a `num_samples` input tensor). If `analytic_op` is provided, the differentiator's `differentiate_analytic` method will be invoked (which requires `analytic_op` to be an analytic expectation op that does NOT have `num_samples` as an input). If both `sampled_op` and `analytic_op` are provided, an exception will be raised.

*CAUTION*

This `generate_differentiable_op()` can be called only ONCE because of the `one differentiator per op` policy. You need to call `refresh()` to reuse this differentiator with another op.

#### Args:

* `sampled_op`: A `callable` op that you want to make differentiable using this differentiator's `differentiate_sampled` method.
* `analytic_op`: A `callable` op that you want to make differentiable using this differentiator's `differentiate_analytic` method.

#### Returns:

A `callable` op whose gradients are now registered to be a call to this differentiator's `differentiate_*` function.
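As a hedged illustration of the mutual-exclusion rule above (the exact exception type is version-dependent, so a broad catch is used here purely for demonstration):

```
import tensorflow_quantum as tfq

diff = tfq.differentiators.ForwardDifference()
try:
    # Supplying both ops violates the interface contract described above.
    diff.generate_differentiable_op(
        sampled_op=tfq.get_sampled_expectation_op(),
        analytic_op=tfq.get_expectation_op())
except Exception as err:
    print('Provide exactly one op:', err)
```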

### `refresh`

```
refresh()
```

Refresh this differentiator in order to use it with other ops.
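A minimal sketch of the intended workflow, reusing one differentiator across two ops (the choice of `ForwardDifference` and of the two expectation ops is illustrative):

```
import tensorflow_quantum as tfq

diff = tfq.differentiators.ForwardDifference()
analytic = diff.generate_differentiable_op(
    analytic_op=tfq.get_expectation_op())

# A second generate_differentiable_op() call without refresh() would
# violate the one-differentiator-per-op policy noted above.
diff.refresh()  # detach from the first op so the differentiator is reusable
sampled = diff.generate_differentiable_op(
    sampled_op=tfq.get_sampled_expectation_op())
```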