hey

In this notebook you initialized the weights with shape (4, 7). I tried modifying this many times, but none of my attempts matched your implementation's output. Is there a general way to initialize weights? There are many approaches in deep neural networks, but do they carry over to quantum circuits?

My guess is that it has something to do with the number of gates, right? I know the first parameter is the number of layers, but what about the second one?

@Maria_Schuld @Christian_Gogolin

@josh

# Weights of neural network

**Maria_Schuld**#2

You are right: the shape of the weights has to correspond to the way you access the weights in the quantum node. In this case the weight matrix consists of 4 layers with 7 weights per layer.
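As an illustration (this is a sketch, not the notebook's actual code), a weight matrix with this layout can be created with small random values, which is a common heuristic for layered variational circuits so that the circuit starts close to the identity:

```python
import numpy as np

# Hypothetical settings matching the discussion: 4 layers, 7 weights per layer.
num_layers = 4
weights_per_layer = 7

# Small random angles near zero; the scale 0.01 is an assumed choice here.
rng = np.random.default_rng(seed=0)
weights = 0.01 * rng.standard_normal((num_layers, weights_per_layer))

print(weights.shape)  # (4, 7)
```

The first dimension then matches however many times the layer function is applied, and the second must match the number of weights the layer function actually consumes.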

Your comment actually made me aware that we only need 5 weights per layer in the `layer(v)` function. It does not affect the code, but it might be confusing. Will fix that!

I have already implemented it and it works great. Thanks for your responses; that actually had me scratching my head for more than a week.

Is there a ground rule for weight initialization, or is it problem dependent?

@Maria_Schuld @josh @Christian_Gogolin

**Maria_Schuld**#7

This is one of the many things that the research community still has to find out, which is why PennyLane was written in the first place…