Hey,
In this notebook you initialized the weights with shape (4, 7). I actually tried modifying this many times, but none of my variants matched your implementation's output. Is there a general way to initialize the weights? There are many initialization schemes for deep neural networks, but do they carry over to quantum circuits?
My guess is that it has something to do with the number of gates, right? I know the first parameter is the number of layers, but what does the second one mean?
@Maria_Schuld @Christian_Gogolin
@josh
You are right: the shape of the weights has to correspond to the way you access the weights in the quantum node. In this case the weight matrix consists of 4 layers with 7 weights per layer.
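To make the shape convention concrete, here is a minimal NumPy sketch (not the notebook's exact code; `layer` and the 0.05 scaling factor are placeholders based on the numbers discussed in this thread): the first axis is the number of layers, the second is the number of parameters each layer consumes.

```python
import numpy as np

NUM_LAYERS = 4          # first dimension of the weight matrix
WEIGHTS_PER_LAYER = 7   # second dimension: parameters consumed per layer

# Small random initial weights; the 0.05 factor keeps the circuit
# close to its unparametrized starting point at the beginning of training.
weights = 0.05 * np.random.randn(NUM_LAYERS, WEIGHTS_PER_LAYER)

def layer(v):
    # Hypothetical per-layer routine: it receives exactly one row of the
    # weight matrix, i.e. WEIGHTS_PER_LAYER parameters.
    assert v.shape == (WEIGHTS_PER_LAYER,)

# The quantum node would loop over the rows, one row per layer:
for v in weights:
    layer(v)
```

The key point is simply that the array shape and the access pattern inside the quantum node must agree; if `layer(v)` only indexed 5 entries of `v`, a (4, 5) matrix would suffice.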
Your comment actually made me aware that we only need 5 weights per layer in the layer(v) function. It does not affect the code, but it might be confusing. I will fix that!
See Pull Request #141.
I have already implemented it, and it works great.
Thanks for your responses. That actually had me scratching my head for more than a week.
Try using it without the 0.05 factor; it gave me better results.
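For anyone following along, the only difference is the spread of the initial weights. A quick illustration (hypothetical numbers, not results from the notebook):

```python
import numpy as np

rng = np.random.default_rng(0)

# With the 0.05 factor: initial weights concentrated near zero,
# so the circuit starts close to its unparametrized state.
scaled = 0.05 * rng.standard_normal((4, 7))

# Without the factor: standard-normal spread, a much wider starting point.
unscaled = rng.standard_normal((4, 7))

print(scaled.std())    # small, on the order of 0.05
print(unscaled.std())  # on the order of 1.0
```

Whether the wider initialization helps is exactly the problem-dependent part discussed below.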
Is there a ground rule for weight initialization, or is it problem-dependent?
@Maria_Schuld @josh @Christian_Gogolin
This is one of many things that the research community has to find out, which is why PennyLane was written in the first place…