I have been going through the official tutorials on quantum neural networks (QNN) and quanvolutional neural networks (QuNN), and I have some questions about both in comparison to classical NNs and classical convolutional NNs:
QNN: Referred Tutorial
- I ran the same model on the MNIST dataset both with and without the quantum layer(s), and it turns out the quantum layer(s) have essentially no impact on validation accuracy. I even tried adding more quantum layers and increasing the number of qubits per layer, but the validation accuracy is still more or less the same with and without them. So my question is: what is the point of adding quantum circuits/layers if they do not affect model training at all?
I believe the quantum parameters are not being trained. If that is the case, and it is the reason the quantum layers do not affect model learning, how can we make them trainable and check whether they improve training compared to a classical NN?
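To check my own understanding of what "trainable" would mean here: as far as I know, frameworks make quantum parameters trainable by computing analytic gradients with the parameter-shift rule, which a classical optimizer can then consume like any other gradient. A minimal NumPy sketch for a single RY rotation followed by a Z measurement (this is my own toy example, not code from the tutorial):

```python
import numpy as np

def expval_z(theta):
    """<Z> after RY(theta) applied to |0>; analytically this equals cos(theta)."""
    # RY(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    # <Z> = |amp_0|^2 - |amp_1|^2
    return state[0] ** 2 - state[1] ** 2

def parameter_shift_grad(theta, shift=np.pi / 2):
    """Gradient of <Z> w.r.t. theta via the parameter-shift rule."""
    return (expval_z(theta + shift) - expval_z(theta - shift)) / 2

theta = 0.7
print(parameter_shift_grad(theta))  # matches the analytic gradient -sin(theta)
print(-np.sin(theta))
```

If this is the right picture, then the quantum weights should be updatable exactly like classical weights once the framework is told to differentiate through the circuit.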
- Is it necessary for the first and last layers of a hybrid QNN to be classical? Can we currently build a complete QNN without any classical layers?
QuNN: This tutorial
- This seems more suitable for image classification problems, right? However, the quanvolution of the MNIST images takes far too long: for 5000 training and 1000 test images it took roughly 5+ hours on Google Colab, which extrapolates to around 60 hours for the complete dataset (70,000 images). Is this even practical for MNIST, which is considered the "hello world" dataset of classical CNNs?
We need a reasonably large dataset to gauge the performance, since with this little data even a very simple CNN overfits within a couple of epochs, and I suppose the same would hold for QuNNs.
- Even on the small (5000/1000) dataset, the quanvolved data performs almost the same as the classically convolved data. So what is the point of quantum convolution, which seems very impractical compared to its classical counterpart? In the tutorial the quanvolved images are processed by a classical NN, so I thought adding some quantum layers might process them better, but there was no significant effect (possibly because of the non-trainable quantum parameters discussed above).
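For context on the runtime I quoted above, here is my back-of-the-envelope count of circuit executions, assuming the tutorial's 2x2 kernel with stride 2 on 28x28 images (the per-patch cost model is my own estimate):

```python
# One circuit run per 2x2 patch, stride 2, on 28x28 MNIST images
patches_per_image = (28 // 2) * (28 // 2)   # 14 * 14 = 196 patches
images_processed = 5000 + 1000              # the train + test split I used
total_circuits = patches_per_image * images_processed
print(total_circuits)                       # over a million circuit simulations

# Extrapolating my ~5 h run to the full 70,000-image dataset
full_dataset_hours = 5 * 70_000 / images_processed
print(round(full_dataset_hours))            # roughly 60 hours
```

So if one circuit simulation per patch is really the cost, the hours I observed seem expected rather than something wrong with my setup.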
In both models (QNN and QuNN) the optimizer used is a classical one. If we make the quantum parameters trainable, it would surely need a quantum optimizer, right? I am not sure how to integrate one here, particularly in the hybrid setting; any light on that would be appreciated.
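For what it's worth, my current (possibly wrong) understanding is that a plain classical gradient-descent loop can already train a quantum parameter, as long as the gradient comes from the circuit itself (e.g. via the parameter-shift rule). A toy NumPy sketch of the hybrid loop I have in mind, using a single-parameter circuit whose expectation value is cos(theta) — all names here are mine, nothing is from either tutorial:

```python
import numpy as np

def expval_z(theta):
    """Output of the toy 'quantum layer': <Z> after RY(theta) on |0>, i.e. cos(theta)."""
    return np.cos(theta)

def grad(theta):
    """Parameter-shift gradient of the expectation value."""
    return (expval_z(theta + np.pi / 2) - expval_z(theta - np.pi / 2)) / 2

# Treat <Z> itself as the loss and minimise it with ordinary gradient descent
theta, lr = 0.1, 0.4
for _ in range(100):
    theta -= lr * grad(theta)

print(expval_z(theta))  # converges toward -1, the minimum of <Z>
```

If that picture is correct, then my real question is just how to wire such circuit gradients into the optimizer of the hybrid model from the tutorials, not whether a separate "quantum optimizer" exists.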
Thanks for this great discussion forum. It has been, and hopefully will continue to be, very helpful.