PennyLane optimizer with hybrid QNN

Hello,

How can I use the PennyLane optimizers in a hybrid neural network? The model works fine with TensorFlow optimizers like Adam, SGD, etc., but when I try to run the same model with PennyLane optimizers like QNGOptimizer, the following error pops up:

ValueError: Could not interpret optimizer identifier: <pennylane.optimize.qng.QNGOptimizer object at 0x7f619aa4c810>

The same error pops up for other optimizers like AdamOptimizer as well. Can someone please show me how to replace the optimizer with a PennyLane optimizer in this tutorial on hybrid NNs?

Thank you…

Hi @Muhammad_Kashif, thanks for your question :slight_smile:

All of the optimizers included in PennyLane are meant to be used with the PennyLane-provided version of NumPy, i.e., the version you get when you do from pennylane import numpy as np. So if you want to port a tutorial to work with PennyLane’s built-in optimizers, you’ll have to rework that tutorial’s code from TensorFlow or PyTorch into NumPy (though a great many tutorials are already written in NumPy).
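For example, here is a minimal sketch of that pattern (the circuit, device, and hyperparameters below are illustrative, not taken from the tutorial):

```python
import pennylane as qml
from pennylane import numpy as np  # PennyLane's wrapped NumPy

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def circuit(params):
    qml.RX(params[0], wires=0)
    qml.RY(params[1], wires=0)
    return qml.expval(qml.PauliZ(0))

# Trainable parameters must be PennyLane-NumPy arrays with requires_grad=True
params = np.array([0.1, 0.2], requires_grad=True)

opt = qml.AdamOptimizer(stepsize=0.1)
for _ in range(100):
    params = opt.step(circuit, params)  # one optimization step
```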

Note that some built-in optimizers, like AdamOptimizer, should work out of the box as long as your hybrid model is coded in PennyLane-provided NumPy. Others, like QNGOptimizer, are more “quantum” than “hybrid”: they are designed to work with quantum circuits and may not necessarily extend to full hybrid models.
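As a rough illustration of the QNGOptimizer case (reusing the circuit sketch above; the stepsize and loop count are arbitrary):

```python
# QNGOptimizer computes the quantum metric tensor of the circuit, so the
# cost passed to step() should be a single QNode, not an arbitrary hybrid model.
opt = qml.QNGOptimizer(stepsize=0.01)
for _ in range(100):
    params = opt.step(circuit, params)
```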

Hi,

Sorry for asking in a bit of an older thread, but my question is related to this topic.

Can we use PennyLane optimizers like AdamOptimizer, GradientDescentOptimizer, etc. with Keras model.compile? When I try to use these optimizers with model.compile, I get the following error:

Could not interpret optimizer identifier: <pennylane.optimize.gradient_descent.GradientDescentOptimizer object at 0x000002CE1E571EE0>

PS: I am using NumPy from PennyLane, i.e., from pennylane import numpy as np.

Secondly, are these optimizers available in PennyLane, particularly AdamOptimizer and GradientDescentOptimizer, different from the TensorFlow optimizers (Adam and SGD)?

Thanks for the help.

Hey @Muhammad_Kashif!

Can we use PennyLane optimizers like AdamOptimizer, GradientDescentOptimizer, etc. with Keras model.compile?

The built-in PennyLane optimizers are designed for the NumPy/Autograd interface and are not compatible with TensorFlow. However, gradient descent, Adam, and many common optimizers are already available in TensorFlow and can be used with a Keras model.

Check out the model.compile method for more details on how to specify the TensorFlow-based optimizer. This should be as simple as setting optimizer='adam' if you are happy with the default settings of the TF Adam optimizer.
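For instance, with a hypothetical toy model just to show the call:

```python
import tensorflow as tf

# Hypothetical minimal Keras model for illustration
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])

# Pass an explicit TF optimizer object...
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01), loss="mse")

# ...or the string shortcut if the defaults are fine:
# model.compile(optimizer="adam", loss="mse")
```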

Secondly, are these optimizers available in PennyLane, particularly AdamOptimizer and GradientDescentOptimizer, different from the TensorFlow optimizers (Adam and SGD)?

They are implemented differently (i.e., TensorFlow vs Autograd compatibility), but use the same underlying algorithm.
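To make that concrete, here is a side-by-side sketch on a toy cost x² (not a quantum circuit, just for comparison), where one step of either implementation gives the same update 1.0 - 0.1 * 2.0 = 0.8:

```python
import pennylane as qml
from pennylane import numpy as np
import tensorflow as tf

def cost(x):
    return x ** 2

# PennyLane (Autograd-based)
x = np.array(1.0, requires_grad=True)
x = qml.GradientDescentOptimizer(stepsize=0.1).step(cost, x)  # x -> 0.8

# TensorFlow (Keras SGD)
v = tf.Variable(1.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)
with tf.GradientTape() as tape:
    loss = v ** 2
opt.apply_gradients([(tape.gradient(loss, v), v)])  # v -> 0.8
```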