Thanks @leongfy, I see the issue now! This arises because `cost_n` is computed via a side effect of your cost function. That is, your cost function is updating a global variable rather than returning the cost.
When using Autograd with PennyLane, cost functions must be pure: they cannot perform side effects (such as updating external variables). Otherwise:

- you will see `ArrayBox` objects rather than NumPy arrays, and
- the values stored via the side effect will no longer be differentiable.
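Here is a minimal sketch of that failure mode. The circuit and cost function below are hypothetical stand-ins (not your code), just to show the `ArrayBox` objects that appear when the cost function mutates a global list during gradient computation:

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def circuit(params):
    qml.RX(params[0], wires=0)
    return qml.expval(qml.PauliZ(0))

history = []  # global state mutated by the cost function (the side effect)

def impure_cost(params):
    val = circuit(params)
    history.append(val)  # while Autograd traces the gradient, val is an ArrayBox
    return val

opt = qml.GradientDescentOptimizer(stepsize=0.1)
params = np.array([0.5], requires_grad=True)
params = opt.step(impure_cost, params)

print(history)  # ArrayBox objects rather than plain NumPy values
```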
In this particular case, since you are not computing the gradient of the side effect, it should be okay. You can use the `qml.math.toarray()` function to convert any `ArrayBox` to a NumPy array:
```python
def cost_fn(params):
    cost_0 = sum(circuit_0(params, wires=0))
    cost_1 = sum(circuit_1(params, wires=0))
    # Convert the ArrayBox to a plain NumPy value before storing it
    cost_n.append(qml.math.toarray(cost_0 / cost_1))
    return -(cost_0 ** 2) / cost_1
```
This should now show you numeric values when you print `cost_n`.
Note that the PennyLane optimizers also have the method `step_and_cost()`, which returns both the updated parameter values and the value of the cost function (evaluated at the parameters prior to the step):
```python
import pennylane as qml
from pennylane import numpy as np

params = np.array([0.0], requires_grad=True)
max_iterations = 10
step_size = 0.01
cost_n = []

opt = qml.GradientDescentOptimizer(stepsize=step_size)

for n in range(max_iterations):
    # step_and_cost returns the new parameters and the cost before the step
    params, cost = opt.step_and_cost(cost_fn, params)
    print(f"Step: {n} Cost: {cost}")

print("cost_n =", cost_n, type(cost_n))
```