A quick question: when analyzing barren plateaus in quantum neural networks (QNNs), is it compulsory to fix the depth of the quantum layers and only increase the number of qubits? Or would it be OK to increase the depth along with every increase in the number of qubits? Not sure if that would make sense.
Thanks in advance
This is a great question!
Having more qubits is likely to make your barren plateau problem worse. Barren plateaus are defined by this scaling: the variance of the cost gradient vanishes exponentially as the number of qubits grows.
More depth won’t necessarily help either. Sufficiently deep random circuits approximate 2-designs, which is exactly the regime where barren plateaus appear, and extra depth can also cause overfitting or other issues.
Some strategies that might help are:
- Changing your cost function
- Trying a different initialization of your parameters
- Making sure your initial circuit doesn’t approximate a Haar-random unitary
We have a PennyLane demo on Barren plateaus in QNNs which can help you explore this issue.
I hope this helps!