Possible discrepancy in the "Quantum gradients with backpropagation" tutorial

Hi, I have a basic question related to the above-mentioned tutorial (Quantum gradients with backpropagation | PennyLane Demos).

For the parameter-shift example, I observed that we get different values for the gradient when we compute it with the manually defined parameter-shift function than when we use the built-in jax.grad or qml.gradients.param_shift functions.

Is this expected?

The former gives
[-4.86500983e-01 -8.39988913e-02 -5.55111512e-17 -4.91965937e-01 -6.09742831e-01 0.00000000e+00]

while the latter gives
(Array(-0.01981529, dtype=float64), Array(-0.04948435, dtype=float64), Array(1.16219626e-16, dtype=float64), Array(-0.0307452, dtype=float64), Array(-0.26629899, dtype=float64), Array(-5.03138276e-17, dtype=float64))
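For context, here is a minimal sketch of the kind of comparison I mean. The circuit, parameter values, and helper name below are illustrative placeholders of my own, not the tutorial's actual code:

```python
import jax
import jax.numpy as jnp
import pennylane as qml

jax.config.update("jax_enable_x64", True)

dev = qml.device("default.qubit", wires=2)

# Placeholder circuit; the tutorial uses a different, deeper circuit.
@qml.qnode(dev, interface="jax")
def circuit(params):
    qml.RX(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliX(1))


def parameter_shift(f, params, i):
    """Two-term parameter-shift rule for a Pauli-rotation parameter:
    df/dtheta_i = [f(theta + (pi/2) e_i) - f(theta - (pi/2) e_i)] / 2."""
    shift = jnp.zeros_like(params).at[i].set(jnp.pi / 2)
    return (f(params + shift) - f(params - shift)) / 2


params = jnp.array([0.1, 0.2])

# Manual parameter-shift gradient, one parameter at a time.
manual = jnp.stack([parameter_shift(circuit, params, i) for i in range(len(params))])

# Automatic gradient via jax.grad for comparison.
auto = jax.grad(circuit)(params)

print(manual)
print(auto)
# If the manual rule is implemented correctly, the two should agree
# up to floating-point precision.
```

Note that the pi/2 shift and the factor of 1/2 are specific to gates generated by Pauli operators; gates with other generators need different shift rules.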

Hey @singhmankrit, welcome to the forum :rocket:

This is definitely a bug on our end :eyes:. I’ve let the author know and we’ll look into it :slight_smile:. We’ll track the progress here: [BUG] Backprop tutorial has an incorrect gradient result compared to other built-in methods · Issue #1050 · PennyLaneAI/qml · GitHub

Thanks for catching this @singhmankrit! Looks like there was a bug in the demo, which is corrected here: Fix a bug in the backprop tutorial by josh146 · Pull Request #1051 · PennyLaneAI/qml · GitHub

Great! Thanks for looking into this! :slight_smile:
