Adding noise to a Keras hybrid NN

Hello, I have a QNode in a Keras NN defined as:

@qml.qnode(dev, interface="tf", grad_method="backprop")
def qnode(inputs, weights):
    for i in range(blocks):
        qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
        qml.templates.StronglyEntanglingLayers(weights[i], wires=range(n_qubits)) 
    return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

I would like to add some noise to mimic a classical dropout layer. I have tried to use the noise from this demo: https://pennylane.ai/qml/demos/tutorial_noisy_circuits.html, but it seems the noise functions are not supported by Keras?! Any suggestions for implementing noise? Thanks in advance!

Hi @NikSchet,

Thank you for your question! :slightly_smiling_face:

The following example simulates a noise model by:

  • Using the default.mixed device that supports noise models
  • Applying qml.DepolarizingChannel noise channels within the quantum function

import pennylane as qml
from pennylane import numpy as np

n_qubits = 5

dev = qml.device('default.mixed', wires=n_qubits)

inputs = np.ones(5, requires_grad=False)
weights = np.ones((3,3,5,3), requires_grad=True)
blocks = 3
p = 0.01

@qml.qnode(dev, interface="tf", diff_method="best")
def qnode(inputs, weights):
    for i in range(blocks):
        qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
        qml.templates.StronglyEntanglingLayers(weights[i], wires=range(n_qubits))
        qml.DepolarizingChannel(p, wires=0)
        qml.DepolarizingChannel(p, wires=1)
    return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

weight_shapes = {"weights": (3,3,5,3)}
qlayer = qml.qnn.KerasLayer(qnode, weight_shapes, output_dim=n_qubits)
qlayer(inputs)
<tf.Tensor: shape=(5,), dtype=float32, numpy=
array([ 0.02523608,  0.0064983 ,  0.248738  , -0.07350533, -0.12045245],
      dtype=float32)>

Would this be something close to what you’d like to have?

Note: we’re setting diff_method="best" in the example which will correspond to the parameter-shift rule. The reason for this is that the default.mixed device doesn’t support backpropagation. Furthermore, in the original linked example passing grad_method="backprop" to the QNode will not take any effect, as the keyword is diff_method rather than grad_method.

Let us know if we could further help. :slightly_smiling_face:

Thank you very much for the help.

Furthermore, in the original linked example passing grad_method="backprop" to the QNode will not take any effect, as the keyword is diff_method rather than grad_method.

So, it didn't make sense in the code shown before to add "backprop" (I haven't enabled backpropagation), and instead I should have used:

grad_method="diff_method"

to get backpropagation?

No worries! In the code shown before, specifying grad_method="backprop" indeed doesn’t enable backpropagation.

It would have to be diff_method="backprop" passed to qml.qnode such that it’s

@qml.qnode(dev, interface="tf", diff_method="backprop")
def qnode(inputs, weights):

This will only work with default.qubit though.

Thank you very much. I suppose backpropagation is very important for a hybrid model and will probably boost metrics (like accuracy, prediction grids, etc.), so I will enable it for default.qubit using diff_method="backprop" as suggested.

I was under the impression that backpropagation was enabled by default in hybrid models (by hybrid I refer to a QNode sandwiched between classical layers).

Thanks!!

Hi @NikSchet, it is worth noting that if no diff_method is passed to @qml.qnode, then diff_method="best" is the default. When possible, this will result in using backpropagation by default, as is the case for default.qubit.

As diff_method was not specified in the original snippet and grad_method was ignored, the QNode defaulted to diff_method="best". For the default.qubit device, that is equivalent to passing diff_method="backprop" to the QNode decorator.

So when using default.qubit, backpropagation is used both when no diff_method argument is passed and when passing diff_method="backprop" to the QNode decorator.

Passing the diff_method="backprop" argument explicitly is useful to make the differentiation method clear, and it also makes it easy to swap to another diff method if need be.

thank you very much for the explanation:)

Hi @NikSchet,

An update here: we’ve recently focused on allowing backpropagation support for default.mixed and enhanced compatibility with TF and Keras.

See the following (approved) pull request that will be merged into master and is to be released with v0.25.0 of PennyLane: https://github.com/PennyLaneAI/pennylane/pull/2776

Hello! It looks like you're only adding noise to the first two wires. Is that true? If so, what about the others?
Thanks!

Hi @Corey,

You can keep applying noise channels to the other wires in the same way as Antal did for the first two.

We’ll be adding PennyLane noise models in the upcoming 0.37 release at the start of July.

These noise models will allow you to control the recipe for when to insert noise channels after target operations. You can check out the (Work In Progress) example here! :raised_hands:

That will be super handy! Looking forward to the additions!

That’s great to hear, @Corey!

I look forward to hearing your thoughts on the new additions.