PennyLane built-in optimizers with learning rate scheduling

Just wondering whether the current PennyLane framework supports learning rate scheduling with its built-in optimizers, such as qml.AdamOptimizer or qml.GradientDescentOptimizer?

Hi @jkwan314, I haven’t seen explicit support for this but let me check with the team just in case. We will get back to you on this next week.

Hi @jkwan314! The optimizers built into PennyLane do not support learning rate scheduling. They are designed for the NumPy interface, but if you are happy to switch to another interface, such as PyTorch, you can use that framework's optimizers, which do support learning rate scheduling.
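To illustrate the PyTorch route, here is a minimal sketch of attaching a learning rate scheduler to a Torch optimizer. For brevity the quantum circuit is replaced by a toy differentiable cost; the assumption is that a QNode using the Torch interface would slot into the same training loop.

```python
import torch

# Toy differentiable cost standing in for a QNode with the Torch interface.
def cost(p):
    return torch.sum(torch.sin(p) ** 2)

params = torch.tensor([0.1, 0.2], requires_grad=True)

optimizer = torch.optim.Adam([params], lr=0.1)
# Halve the learning rate every 10 optimizer steps.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

lrs = []
for step in range(30):
    optimizer.zero_grad()
    loss = cost(params)
    loss.backward()
    optimizer.step()
    scheduler.step()  # advance the schedule once per step
    lrs.append(optimizer.param_groups[0]["lr"])
```

After 30 steps the learning rate has been halved three times, so `lrs[-1]` is 0.1 / 8.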

One strength of the built-in PennyLane optimizers is access to quantum-specific methods such as QNGOptimizer. Please let us know if you have a use case for combining quantum-specific optimizers with learning rate scheduling.
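In the meantime, a possible workaround (my suggestion, not an official feature) is to decay the step size yourself between epochs, for example by constructing the optimizer with a freshly computed stepsize each epoch. The sketch below shows the decay pattern with a plain NumPy gradient descent loop standing in for qml.GradientDescentOptimizer; in a real script you would replace the manual update with `qml.GradientDescentOptimizer(stepsize=lr).step(cost, params)`.

```python
import numpy as np

# Toy cost and gradient standing in for a QNode and its parameter-shift gradient.
def cost(p):
    return np.sum(np.sin(p) ** 2)

def grad(p):
    return 2 * np.sin(p) * np.cos(p)

params = np.array([0.4, 1.1])
base_lr, decay = 0.5, 0.9  # exponential decay schedule

for epoch in range(50):
    lr = base_lr * decay**epoch  # the "schedule": shrink the step size each epoch
    # With PennyLane this line would be:
    #   params = qml.GradientDescentOptimizer(stepsize=lr).step(cost, params)
    params = params - lr * grad(params)
```

Recreating the optimizer each epoch is cheap for plain gradient descent; for stateful optimizers like Adam this pattern would reset the moment estimates, which is one reason native scheduler support would be more convenient.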