Why does qml.GradientDescentOptimizer() not update the parameters at every step?

Hello, I am trying to minimize a cost function using qml.GradientDescentOptimizer(), but it does not seem to update the parameters at each step. To illustrate my problem with a simple example, I create a circuit and compute the trace of the square of the resulting density matrix:

import pennylane as qml
from pennylane import numpy as np

device = qml.device(name='default.qubit', wires=4)

@qml.qnode(device, interface="autograd")
def circuit(params):
    # RY is a single-qubit gate, so it is applied to each wire separately
    qml.RY(np.pi, wires=0)
    qml.RY(np.pi, wires=1)
    qml.RY(params[0], wires=0)
    qml.RY(params[0], wires=1)

    qml.Barrier(wires=range(4))

    qml.RY(params[1], wires=1)
    qml.RY(params[1], wires=2)

    qml.Barrier(wires=range(4))

    qml.RY(params[2], wires=2)
    qml.RY(params[2], wires=3)

    return qml.density_matrix(wires=range(4))

def simple_func(matrix):
    # stand-in for the more complicated function used in my real code
    return np.abs(np.trace(matrix))

def cost_func(params):
    # cost is |Tr(rho^2)| of the density matrix returned by the circuit
    densityMatrix = circuit(params)
    return simple_func(np.matmul(densityMatrix, densityMatrix))

I know I could take the trace directly, but in my real code I have to call a function inside the cost function (in any case, taking the trace directly also reproduces the problem). I want to optimize this cost function:

init_params = np.array([np.pi / 2, np.pi * 5, -np.pi / 2], requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.5)

steps = 50
params = init_params

for i in range(steps):
    params = opt.step(cost_func, params)
    print(params, cost_func(params))

The problem is that the parameters do not seem to update at each step. If I use a bigger step size, they change a little during the first two cycles but then remain stuck at one particular value. I am new to this type of optimization, so maybe I am missing something related to the step size or the cost function. The printed output is:

[ 1.57079633 15.70796327 -1.57079633] 1.0000000000000007
[ 1.57079633 15.70796327 -1.57079633] 1.0000000000000007
[ 1.57079633 15.70796327 -1.57079633] 1.0000000000000007
...
(the same values are printed for all 50 steps)
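
For completeness, the gradient at the initial parameters can also be inspected directly with qml.grad (a minimal sketch using the same cost function as above):

grad_fn = qml.grad(cost_func)
print(grad_fn(init_params))  # (near-)zero values here would mean the cost is flat at this point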

Here is my version and platform information:

Name: PennyLane
Version: 0.23.0
Summary: PennyLane is a Python quantum machine learning library by Xanadu Inc.
Home-page: https://github.com/XanaduAI/pennylane
Author: 
Author-email: 
License: Apache License 2.0
Location: c:\users\kryst\miniconda3\envs\myjup\lib\site-packages
Requires: appdirs, autograd, autoray, cachetools, networkx, numpy, pennylane-lightning, retworkx, scipy, semantic-version, toml
Required-by: PennyLane-Lightning

Platform info:           Windows-10-10.0.22621-SP0
Python version:          3.9.18
Numpy version:           1.26.0
Scipy version:           1.11.3
Installed devices:
- default.gaussian (PennyLane-0.23.0)
- default.mixed (PennyLane-0.23.0)
- default.qubit (PennyLane-0.23.0)
- default.qubit.autograd (PennyLane-0.23.0)
- default.qubit.jax (PennyLane-0.23.0)
- default.qubit.tf (PennyLane-0.23.0)
- default.qubit.torch (PennyLane-0.23.0)
- lightning.qubit (PennyLane-Lightning-0.23.0)

I’m on Windows 11 Home Single Language, version 22H2, and I installed PennyLane with the latest Miniconda3 distribution.

Hi @Krys !

It’s good to see this question here.

I noticed a couple of things:

  1. You’re using a very old version of PennyLane. If possible, please upgrade using python -m pip install pennylane --upgrade
  2. All optimizers are different! You can check out the different optimizers available in PennyLane in the docs. You can try different ones to see if one of them helps you solve your problem (see the short sketch after this list).
  3. Some problems require extra work in order to be optimizable. Sometimes the optimization landscape is very flat, or there are many local minima. You can read about barren plateaus in this demo, for example, and learn about local and global cost functions in this demo.
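
For point 2, a quick way to experiment is to swap out the optimizer while keeping the rest of your loop the same. Here is a minimal sketch using the Adam optimizer, reusing your cost_func, init_params and steps from above (the stepsize of 0.1 is just a placeholder value):

opt = qml.AdamOptimizer(stepsize=0.1)  # any other PennyLane optimizer can be dropped in here

params = init_params
for i in range(steps):
    params = opt.step(cost_func, params)
    print(params, cost_func(params))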

I hope these insights and resources can help you get going with your problem!