I am currently using the code from the following GitHub repo to benchmark a different imbalanced binary classification dataset:
https://github.com/Gruntrexpewrus/quantum-neural-networks/blob/master/fraud_detection/fraud_detection.py
Should I modify any of the values in the portion below, given that my dataset has 14 independent features and 1 target feature? (Note that this code was written for a dataset with 29 PCA features, which is probably why the authors chose 10 input neurons. My dataset, however, does not contain any PCA features.)
```python
# Input neurons
input_neurons = 10
# Widths of hidden layers
nn_architecture = [10, 10]
# Output neurons of classical part
output_neurons = 14
```
If I take input_neurons = 14, what should be the values of nn_architecture and output_neurons?
Hi @Abrar2652, welcome to the forum!
In principle you wouldn't need to change this section. The values of these three variables can be set independently of each other, so you could have 14 input neurons and smaller hidden layers.
From a practical perspective, what you can do is take the original code, make sure it runs, and then modify nn_architecture to make the hidden layers smaller than the number of input neurons. If this change causes errors down the line, you will probably need to change other parts of the code to make it work.
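For example, here is a minimal sketch of what that section might look like for your 14-feature dataset (the hidden-layer widths are illustrative, not tuned, and output_neurons is left at the repo's original value since it feeds the quantum part of the model):

```python
# Input neurons: match the 14 independent features of the dataset
input_neurons = 14
# Widths of hidden layers: may be smaller than the input layer
nn_architecture = [10, 10]
# Output neurons of classical part: kept at the repo's original value
output_neurons = 14
```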
Let me know if this works for you!
@CatalinaAlbornoz Thank you for your response. I was able to run the code using the parameters you mentioned, with [10, 10] hidden layers. However, it takes a long time to run a single training-and-testing script (about half a day). Is there any GPU implementation of an analogous QNN that I might be able to integrate with this code?
Hi @Abrar2652,
You could pip install tensorflow-gpu and use it instead of the normal tensorflow.
For increased speed you could also reduce the cutoff or the size of the problem, although this would lower the quality of your results.
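As a quick sanity check, assuming a TF 1.x environment (which this repo targets), something like the following should confirm that TensorFlow can see the GPU. The cutoff variable mirrors the one used in the repo; the value shown is only an illustration of the speed/accuracy trade-off:

```python
import tensorflow as tf

# Confirm a GPU device is visible to TensorFlow (TF 1.x API)
print(tf.test.is_gpu_available())   # True if a usable GPU is found
print(tf.test.gpu_device_name())    # e.g. '/device:GPU:0'

# A smaller Fock-space cutoff shrinks the simulated Hilbert space,
# trading result quality for speed (illustrative value, not tuned)
cutoff = 6
```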
Let me know if any of these options work for you!
Alternatively, you can use the PennyLane TensorFlow plugin.
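For instance, here is a minimal sketch of a QNode using PennyLane's TensorFlow interface (the two-qubit circuit is a placeholder, not the fraud-detection model):

```python
import pennylane as qml
import tensorflow as tf

dev = qml.device("default.qubit", wires=2)

# interface="tf" makes the QNode differentiable with TensorFlow
@qml.qnode(dev, interface="tf")
def circuit(weights):
    qml.RY(weights[0], wires=0)
    qml.RY(weights[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0))

weights = tf.Variable([0.1, 0.2])
with tf.GradientTape() as tape:
    out = circuit(weights)
print(tape.gradient(out, weights))  # gradients flow through TensorFlow
```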