Hello, I hope everyone on the Xanadu team is having a good holiday season.
I have a few questions regarding using PyTorch gradients with PennyLane:
- I cannot find the source at the moment, but I recall reading that if you want to calculate a gradient inside a loss function, you need to use PennyLane with the PyTorch interface. Is this still the case?
- If you use PennyLane with PyTorch, you have to use torch.autograd and the PyTorch optimisers. How does this affect the use of the parameter-shift rule? If we were to use torch.autograd and then run on real quantum hardware, would the parameter-shift rule still be used, with the shifted circuits evaluated on the hardware? Or does the gradient calculation end up happening classically somewhere?
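To make these first two questions concrete, here is a rough sketch of the workflow I have in mind, written against a plain qubit simulator just for illustration (the device, the circuit, and the diff_method="parameter-shift" keyword are my guesses/placeholders, not something from my real code):

import pennylane as qml
import torch

dev_toy = qml.device("default.qubit", wires=1)

# placeholder circuit; diff_method="parameter-shift" is my guess at the relevant option
@qml.qnode(dev_toy, interface="torch", diff_method="parameter-shift")
def toy_circuit(theta):
    qml.RX(theta[0], wires=0)
    qml.RY(theta[1], wires=0)
    return qml.expval(qml.PauliZ(0))

theta = torch.tensor([0.1, 0.2], requires_grad=True)
opt = torch.optim.Adam([theta], lr=0.1)

for step in range(10):
    opt.zero_grad()
    loss = (toy_circuit(theta) - 0.5) ** 2  # QNode output used inside the loss
    loss.backward()  # is this the point where parameter-shift evaluations would run on hardware?
    opt.step()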
- We can use Strawberry Fields as a backend for PennyLane and use PyTorch at the same time, but I have run into a problem with this specific setup. I can create a model like:
import pennylane as qml
import numpy as np
import torch

# CV device -- I am using the Strawberry Fields Fock backend (cutoff chosen arbitrarily here)
dev = qml.device("strawberryfields.fock", wires=1, cutoff_dim=10)

def layer(v):
    # one CV layer: rotation, squeezing, rotation, displacement, Kerr
    qml.Rotation(v[0], wires=0)
    qml.Squeezing(v[1], 0.0, wires=0)
    qml.Rotation(v[2], wires=0)
    qml.Displacement(v[3], 0.0, wires=0)
    qml.Kerr(v[4], wires=0)

@qml.qnode(dev, interface='torch')
def quantum_neural_net(var, x=None):
    # encode the input as a displacement, then apply the layers
    qml.Displacement(x, 0.0, wires=0)
    for v in var:
        layer(v)
    return qml.expval(qml.X(0))
Then probe it with:
num_layers = 2
theta_weights = torch.tensor(0.05 * np.random.randn(num_layers, 5))
i = torch.tensor(1.0, requires_grad=True)
O = quantum_neural_net(theta_weights, x=i)
In this toy model, though, when I try to find the gradient, autograd thinks I have broken the graph:
dx = torch.autograd.grad(outputs=O, inputs=i,
                         grad_outputs=O.data.new(O.shape).fill_(1),
                         create_graph=True, retain_graph=True)[0]
RuntimeError: One of the differentiated Tensors
appears to not have been used in the graph.
Set allow_unused=True if this is the desired behavior.
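For context, the end goal is to use a gradient like dx inside a loss, roughly as in the fragment below. This is only a sketch of the intent, continuing from the snippet above, and obviously it cannot run while the error above persists:

target = torch.tensor(0.0)
loss = (dx - target) ** 2  # penalise the derivative of the circuit output w.r.t. its input
loss.backward()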
So, my question is: is the combination of the Strawberry Fields backend, the PyTorch interface, and torch.autograd gradient calculation simply not available at the moment?
Thank you!
P.S. I understand you are all on your break now; I don't expect a reply until you are back. Have a good break!