ERROR when using ECG dataset on quanvolutional neural network demo (ValueError: RY: wrong number(s) of dimensions in parameters. Parameters with ndims (2,) passed, (0,) expected)

Hi, I am trying to modify the quanvolutional neural network demo, which uses the MNIST dataset for classification, to instead classify ECG signals from the MIT-BIH dataset using quanvolutional neural networks. I have already preprocessed the ECG data by converting the 1D ECG signals into 2D ECG images stored in a .csv file. When I run the quantum preprocessing on this data I encounter an error. Kindly help me understand the error and suggest the best possible solution; this is for my thesis work. Thanks.

if PREPROCESS == True:
    q_train_images = []
    print("Quantum pre-processing of train images:")
    for idx, img in enumerate(train_images):
        print("{}/{}        ".format(idx + 1, n_train), end="\r")
        q_train_images.append(quanv(img))
    q_train_images = np.asarray(q_train_images)

    q_test_images = []
    print("\nQuantum pre-processing of test images:")
    for idx, img in enumerate(test_images):
        print("{}/{}        ".format(idx + 1, n_test), end="\r")
        q_test_images.append(quanv(img))
    q_test_images = np.asarray(q_test_images)

    # Save pre-processed images
    np.save(SAVE_PATH + "q_train_images.npy", q_train_images)
    np.save(SAVE_PATH + "q_test_images.npy", q_test_images)

# Load pre-processed images
q_train_images = np.load(SAVE_PATH + "q_train_images.npy")
q_test_images = np.load(SAVE_PATH + "q_test_images.npy")

Below is the error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-46-047b3a4e5f4c> in <module>
      4     for idx, img in enumerate(train_images):
      5         print("{}/{}        ".format(idx + 1, n_train), end="\r")
----> 6         q_train_images.append(quanv(img))
      7     q_train_images = np.asarray(q_train_images)
      8 

8 frames
/usr/local/lib/python3.8/dist-packages/pennylane/operation.py in _check_batching(self, params)
    926             ]
    927             if not all(correct or batched for correct, batched in ndims_matches):
--> 928                 raise ValueError(
    929                     f"{self.name}: wrong number(s) of dimensions in parameters. "
    930                     f"Parameters with ndims {ndims} passed, {self.ndim_params} expected."

**ValueError: RY: wrong number(s) of dimensions in parameters. Parameters with ndims (2,) passed, (0,) expected**.

Hello @Maham_Iftikhar, welcome to the Forum!

It seems that the image preprocessing might be the issue here. We might be able to help you more if we knew what train_images looks like, or how it was produced.

Cheers,

Alvaro

@Alvaro_Ballon I am sharing the GitHub link with you; I have used the same approach as given in this link: ECG_Classification_with_2D_CNN/ECG_Heartbeat_Classification_2Dconv.02b.etl-Copy1.ipynb at main · celiedel/ECG_Classification_with_2D_CNN · GitHub
It should help you understand it easily.

Hi @Maham_Iftikhar, thank you for sharing the code. Could you please send us a minimal example that replicates your error? Maybe try with code that uses only a single image. I believe the problem is that your image has 2 dimensions, so PennyLane thinks you're sending a batch of parameters. You probably need to make sure each value fed into the circuit is a scalar, for example by flattening that 2D data into 1D, and then train based on this.

Please let me know if this helps and if you can send us a small but self-contained example of your issue we can help you better.
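To see why the traceback mentions ndims (2,) versus (0,), here is a toy re-implementation of the dimension check (a hypothetical helper, not PennyLane's actual code): RY expects a scalar angle, so passing a whole 2D image array as the parameter fails exactly as in the error above.

```python
import numpy as np

def check_ry_param(theta):
    """Mimic the kind of parameter-dimension check PennyLane performs.

    RY expects a single scalar rotation angle (ndim 0). Passing an
    array with more dimensions is interpreted as broadcasting/batching,
    and a 2D array produces the "(2,) passed, (0,) expected" error.
    """
    ndim = np.ndim(theta)
    if ndim != 0:
        raise ValueError(
            "RY: wrong number(s) of dimensions in parameters. "
            f"Parameters with ndims ({ndim},) passed, (0,) expected."
        )
    return True

assert check_ry_param(0.3)  # a scalar angle is fine

try:
    check_ry_param(np.ones((28, 28)))  # a whole 2D image is not
except ValueError as err:
    assert "(2,)" in str(err)
```

So the fix is to index into the image and pass individual pixel values (scalars), not the image array itself.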

Thanks @CatalinaAlbornoz for your suggestion. The error is solved now; it was due to the invalid shape of my training images, which in my case were not 28x28 pixels as expected by the PennyLane code for the MNIST dataset. After reshaping the training images the error disappeared and the code works properly.
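For anyone hitting the same issue, here is a minimal sketch of that reshaping step, assuming (hypothetically) that the CSV stores each ECG image as one flat row of 784 values (28 × 28, the layout the quanvolution demo expects):

```python
import numpy as np

# Hypothetical example: one 2D ECG "image" stored as a flat CSV row.
flat_row = np.random.rand(784)  # 28 * 28 pixel values

# Reshape to the (28, 28, 1) layout used in the quanvolution demo
# (height, width, single channel).
img = flat_row.reshape(28, 28, 1)
assert img.shape == (28, 28, 1)

# Now each pixel lookup returns a scalar (ndim 0), which is exactly
# what a rotation gate like RY accepts as its angle parameter.
assert np.ndim(img[0, 0, 0]) == 0
```

If your images are a different size, adjust the reshape target accordingly, or resize them to 28x28 before the quantum preprocessing loop.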

I’m glad you could find the solution @Maham_Iftikhar ! And thank you for sharing it here :slight_smile: