Extracting weights of a QNode circuit object or the TorchLayer object?

Hi,

I wanted to know if there is a way to access the weights of a QNode circuit object, or the weight_shapes information on a TorchLayer object created from a QNode.

As far as I understood, the qtape object on my QNode should expose the circuit and its parameters (which are the weights in my case), but accessing it doesn’t work for me and I’m not sure it’s the right approach.

Kindly help. :slight_smile:

Thank you!

Hey @kamzam! Welcome to the forum :muscle:

Great question! TorchLayer behaves just like a native PyTorch layer, so you can use model.parameters(), or model.named_parameters() if you also want to see which parameter belongs to which layer :slight_smile:. Here’s an example:

import pennylane as qml
import torch

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

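# Two-qubit QNode: embed the classical inputs as rotation angles, then
# apply entangling layers whose rotation angles are the trainable weights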
@qml.qnode(dev)
def qnode(inputs, weights):
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(wires=i)) for i in range(n_qubits)]

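# weight_shapes maps each trainable QNode argument name to its shape;
# TorchLayer registers one torch.nn.Parameter per entry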
n_layers = 2
weight_shapes = {"weights": (n_layers, n_qubits)}

qlayer = qml.qnn.TorchLayer(qnode, weight_shapes)

clayer_1 = torch.nn.Linear(2, 2)
clayer_2 = torch.nn.Linear(2, 2)
softmax = torch.nn.Softmax(dim=1)
layers = [clayer_1, qlayer, clayer_2, softmax]
model = torch.nn.Sequential(*layers)

# named_parameters() pairs each parameter with its dotted name inside the
# Sequential, so the quantum weights show up under the name "1.weights"
for name, param in model.named_parameters():
    print(name, param)
0.weight Parameter containing:
tensor([[ 0.0943, -0.2240],
        [ 0.4012, -0.4323]], requires_grad=True)
0.bias Parameter containing:
tensor([0.5016, 0.2354], requires_grad=True)
1.weights Parameter containing:
tensor([[6.2790, 2.8270],
        [2.4543, 5.2021]], requires_grad=True)
2.weight Parameter containing:
tensor([[ 0.4883,  0.1023],
        [-0.3336, -0.3181]], requires_grad=True)
2.bias Parameter containing:
tensor([...], requires_grad=True)
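
From the output above, the quantum layer is child "1" of the Sequential, so its trainable angles appear under the name 1.weights. If you want that tensor directly, here’s a minimal sketch using only the standard torch.nn.Module API (the parameter is registered under the QNode argument name, "weights" in this example):

# Pull the quantum weights straight off the TorchLayer
quantum_weights = dict(qlayer.named_parameters())["weights"]
print(quantum_weights.shape)  # torch.Size([2, 2]), i.e. (n_layers, n_qubits)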

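About your qtape attempt: the tape is only constructed when the QNode actually executes, so there is nothing to inspect before the first forward pass. Here’s a minimal sketch, assuming a PennyLane version where the qtape attribute is still available (it records the most recent execution):

# Run the model once so the QNode executes and records a tape
_ = model(torch.rand(1, 2))

# The tape's parameters are the angles that were fed into the circuit
print(qnode.qtape.get_parameters())
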
Hope this helps! :smile:


Thanks Isaac, yes this makes sense. :slight_smile:


Awesome! Glad I could help. Let us know if you have any other questions :rocket: