Hi guys,

I am working on a classification problem with Gaussian states as input features.

```
import pennylane as qml
from pennylane import numpy as np
from pennylane.templates.layers import CVNeuralNetLayers

dev = qml.device('strawberryfields.fock', wires=3, cutoff_dim=10)

@qml.qnode(dev)
def quantum_neural_net(pars, cm):
    # zero displacement for the two-mode Gaussian state
    mean_vector = np.zeros(4)
    cm = cm.reshape((4, 4))
    qml.GaussianState(cm, mean_vector, wires=range(2))
    CVNeuralNetLayers(*pars, wires=range(3))
    return qml.expval(qml.X(0))
```

The definitions of the mean squared error and cost function I used are the same as in https://pennylane.ai/qml/demos/tutorial_variational_classifier.html

```
def square_loss(labels, predictions):
    loss = 0
    for l, p in zip(labels, predictions):
        loss = loss + (l - p) ** 2
    loss = loss / len(labels)
    return loss

def cost(weights, cov_matrix, labels):
    predictions = [quantum_neural_net(weights, f) for f in cov_matrix]
    return square_loss(labels, predictions)
```

I use AdamOptimizer for the optimization, but it raises the error "unhashable type: 'numpy.ndarray'":

```
from pennylane.init import cvqnn_layers_all
from pennylane.optimize import AdamOptimizer

init_pars = cvqnn_layers_all(n_layers=2, n_wires=3, seed=None)
opt = AdamOptimizer(0.01, beta1=0.9, beta2=0.999)

var = init_pars
# xtr, ytr are the training dataset and training labels, respectively
for i in range(10):
    var = opt.step(lambda v: cost(v, xtr, ytr), var)
    predictions_train = [np.sign(quantum_neural_net(var, cm)) for cm in xtr]
    print(i + 1, cost(var, xtr, ytr))
```
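As far as I understand, the error itself is plain NumPy behavior: arrays are mutable and do not define a hash, so any code path that tries to use one as a dictionary key or set member fails with exactly this message. A minimal standalone reproduction (the dictionary here is just an illustration, not part of my actual code):

```python
import numpy as np

# NumPy arrays are mutable, so they are unhashable; using one
# where a hashable object is expected raises the same TypeError.
a = np.array([1.0, 2.0])
try:
    {a: "value"}
except TypeError as e:
    print(e)  # unhashable type: 'numpy.ndarray'

# One common workaround is converting to an immutable tuple first:
key = tuple(a.tolist())
cache = {key: "value"}
```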

Following some solutions I found, I rewrote the cost function as:

```
def cost(weights, cov_matrix, labels):
    predictions = np.zeros((len(labels), 1), dtype=object)
    for i in range(len(labels)):
        predictions[i, 0] = quantum_neural_net(weights, cov_matrix[i, :])
    return square_loss(labels, predictions)
```

It runs now, but with the warning "Output seems independent of input" and unexpected output: the cost never changes across iterations:

```
1 [tensor(0.69090221, requires_grad=True)]
2 [tensor(0.69090221, requires_grad=True)]
3 [tensor(0.69090221, requires_grad=True)]
4 [tensor(0.69090221, requires_grad=True)]
5 [tensor(0.69090221, requires_grad=True)]
6 [tensor(0.69090221, requires_grad=True)]
7 [tensor(0.69090221, requires_grad=True)]
8 [tensor(0.69090221, requires_grad=True)]
9 [tensor(0.69090221, requires_grad=True)]
10 [tensor(0.69090221, requires_grad=True)]
```
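A simple way to probe whether the cost responds to the parameters at all (independent of the optimizer) is a finite-difference check: perturb each parameter by a small epsilon and see whether the cost value moves. Here is a minimal sketch with a hypothetical toy cost standing in for the quantum circuit, just to show the idea:

```python
import numpy as np

# Hypothetical stand-in for the real quantum cost function.
def toy_cost(params):
    return np.sum(params ** 2)

def responds_to_params(cost_fn, params, eps=1e-4):
    """Return True if perturbing any parameter changes the cost."""
    base = cost_fn(params)
    for i in range(len(params)):
        shifted = params.copy()
        shifted[i] += eps
        if abs(cost_fn(shifted) - base) > 1e-12:
            return True  # at least one parameter affects the cost
    return False  # cost is flat in every direction -> zero gradients

params = np.array([0.1, -0.3, 0.5])
print(responds_to_params(toy_cost, params))  # True
```

If the same check on the real cost returns False everywhere, the zero-gradient warning is genuine and the problem is in the circuit or the parameter handling, not the optimizer.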