Is it feasible to use transfer learning to train a quantum-classical hybrid or even a quanvolutional model on a somewhat large dataset such as Skin Cancer MNIST [https://www.kaggle.com/kmader/skincancermnistham10000]?
Hi @quantumHS! Embedding large datasets within variational quantum circuits is tricky, since common data-embedding methods, such as amplitude embedding or angle embedding, require scaling up the number of qubits as the dataset size increases. In this case, you have two options:

A simulator: the cost of classical simulation grows exponentially with the number of qubits.

A hardware device: publicly accessible quantum hardware devices are still quite small.
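To make the scaling concrete, here is a small sketch (plain Python, no quantum library needed) comparing the qubit counts the two common embeddings would require. The function names are illustrative, not from any library:

```python
import math

def amplitude_embedding_qubits(n_features: int) -> int:
    # Amplitude embedding packs N features into the 2**n amplitudes
    # of an n-qubit state, so n = ceil(log2(N)).
    return math.ceil(math.log2(n_features))

def angle_embedding_qubits(n_features: int) -> int:
    # Angle embedding encodes one feature per qubit rotation,
    # so the qubit count grows linearly with the feature count.
    return n_features

# A flattened 28x28 grayscale image (e.g. a downsampled HAM10000
# sample) has 784 features.
print(amplitude_embedding_qubits(784))  # 10 qubits
print(angle_embedding_qubits(784))      # 784 qubits
```

So amplitude embedding is compact in qubit count (but needs deep state-preparation circuits), while angle embedding quickly exceeds what current public hardware offers.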
Two potential alternatives I can think of:

Data reuploading. Here, the circuit depth increases with the size of the input data, while the qubit number remains constant. Hardware devices might still struggle, however, since they are limited to relatively short circuit depths.
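A toy single-qubit sketch of the reuploading idea, simulated directly with 2x2 matrices in NumPy (real implementations interleave richer, non-commuting trainable layers; the feature and weight values below are hypothetical):

```python
import numpy as np

def ry(theta):
    # Single-qubit RY rotation matrix.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(theta):
    # Single-qubit RZ rotation matrix.
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

def reupload_circuit(features, weights):
    # Data reuploading on ONE qubit: alternate data-encoding rotations
    # with trainable rotations. Circuit depth grows with len(features),
    # but the qubit count stays fixed at 1.
    state = np.array([1.0, 0.0], dtype=complex)  # |0>
    for x, w in zip(features, weights):
        state = ry(w) @ rz(x) @ state
    # Return the <Z> expectation value as the model output.
    return abs(state[0]) ** 2 - abs(state[1]) ** 2

features = [0.1, 0.5, -0.3, 0.8]  # hypothetical 4-feature input
weights = [0.2, -0.1, 0.4, 0.0]   # hypothetical trainable parameters
print(reupload_circuit(features, weights))
```

Note that the depth here is 2 gates per feature, so a 784-feature image would already need a deeper circuit than most current hardware tolerates before decoherence sets in.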

Alternatively, you could try training an autoencoder to reduce the dimension of the dataset (while preserving the features) before embedding it in your quantum circuit.
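Since a linear autoencoder learns the same subspace as PCA, a quick way to prototype this compression step is a truncated SVD (the data here is random stand-in noise, not real HAM10000 images):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for 100 flattened 28x28 images (784 features).
X = rng.normal(size=(100, 784))

# Bottleneck size = number of qubits you can afford for angle embedding.
k = 8

# Center the data, then project onto the top-k right singular vectors;
# this is what a trained linear autoencoder's encoder would compute.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T  # encoded features, shape (100, 8)

print(Z.shape)
```

Each row of `Z` can then be angle-embedded on just `k` qubits. A nonlinear autoencoder (e.g. a small convolutional one in PyTorch or Keras) would typically preserve more structure for the same bottleneck size.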