Hi, I found that qml.qnn.TorchLayer recognizes the batch dimension differently in pennylane==0.30.0 than in pennylane>=0.31.0. Suppose a single batch of data has shape x = (N, D), where N and D are the batch size and the data dimension, respectively. Consider forwarding x through a qml.qnn.TorchLayer L:

In pennylane==0.30.0, it behaves correctly: L(x) also has shape (N, output_shape). More precisely, if x has shape (N1, N2, N3, D), the output of L(x) has shape (N1, N2, N3, output_shape).
In pennylane>=0.31.0, it seems to forward all data in a batch as inputs at once. When I print the shape of inputs inside the circuit, it is exactly the shape of x, i.e., (N, D).
Here is the code for reproducing the error (or maybe this behavior is intentional):
import sys

import torch
import torch.nn as nn
import pennylane as qml

print(f"Python version: {sys.version}")
print(f"PyTorch version: {torch.__version__}")
print(f"Pennylane version: {qml.__version__}")

device = qml.device("default.qubit", wires=2)

@qml.qnode(device)
def circuit(inputs):
    print(f"circuit inputs shape = {inputs.shape}")
    qml.Hadamard(wires=0)
    qml.CNOT(wires=(0, 1))
    return qml.expval(qml.PauliX(0))

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        torch_layer = qml.qnn.TorchLayer(circuit, weight_shapes={})
        self.net = nn.Sequential(torch_layer)

    def forward(self, x):
        print(f"model x shape = {x.shape}")
        x = self.net(x)
        print(f"output shape = {x.shape}")
        return x

N, D = 3, 2
model = Model()
y = model(torch.rand(N, D))
Outputs in 0.30.0:
model x shape = torch.Size([3, 2])
circuit inputs shape = torch.Size([2])
circuit inputs shape = torch.Size([2])
circuit inputs shape = torch.Size([2])
output shape = torch.Size([3])
Outputs in 0.31.0:
model x shape = torch.Size([3, 2])
circuit inputs shape = torch.Size([3, 2])
RuntimeError: shape '[3]' is invalid for input of size 1
Dependencies:
python 3.9.12
pip==23.3.1
torch==2.1.2
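For inputs with multiple leading batch dimensions, such as (N1, N2, N3, D), the old behavior can be approximated with a hypothetical plain-PyTorch wrapper (not a PennyLane API) that collapses the leading dimensions into one before the layer and re-expands them afterwards:

```python
import torch
import torch.nn as nn

class FlattenLeadingDims(nn.Module):
    """Hypothetical helper: collapse all leading batch dimensions into one
    before calling the wrapped layer, then restore them on the output."""

    def __init__(self, layer: nn.Module):
        super().__init__()
        self.layer = layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        *batch_dims, d = x.shape
        out = self.layer(x.reshape(-1, d))  # (prod(batch_dims), ...)
        return out.reshape(*batch_dims, *out.shape[1:])

# Usage sketch with an ordinary linear layer standing in for the TorchLayer:
wrapped = FlattenLeadingDims(nn.Linear(2, 4))
y = wrapped(torch.rand(5, 7, 3, 2))
print(y.shape)  # torch.Size([5, 7, 3, 4])
```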