# Generalization in QML from few training data: predict test label

I am trying to predict the labels of the test data (digits 0 or 1) in the code from the "Generalization in QML from few training data" PennyLane demo. How can I do this correctly?

I'm not sure I understand your question. Are you trying to replicate the results in the demo?

The code is well written and I can get the accuracy, but I want to predict the labels of the test data. For example, if the true test labels are y_test = [0, 1, 0, 0, 1, 1] and the predicted labels are y_pred = [0, 1, 1, 0, 0, 1], then the accuracy is 4/6. I don't know how to go from the accuracy back to the predicted labels, but if I had the predicted labels I could calculate the accuracy.
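The example above, as a quick NumPy check (the label arrays are the ones from this post):

```python
import numpy as np

# The example from this post: true and predicted test labels for six samples.
y_test = np.array([0, 1, 0, 0, 1, 1])
y_pred = np.array([0, 1, 1, 0, 0, 1])

# Accuracy is simply the fraction of matching entries: here 4 of 6.
accuracy = np.mean(y_pred == y_test)
```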

Dear Isaac,

I tried adding this line of code:

```python
predicted_labels = (compute_out(weights, weights_last, x_test, y_test) > 0.5).astype(int)
```

but I am not sure it is correct. For n_test = 10 and n_train = 40 I got this output:

```
{'y_pred': Array([1, 1, 1, 1, 1, 1, 1, 1, 1, 1], dtype=int64),
 'y_test': Array([0, 0, 1, 0, 1, 0, 1, 0, 1, 0], dtype=int64),
 'test_acc': Array(1., dtype=float64)}
```

The test_acc is 1, but the predicted labels do not match y_test.

I think the code does this already!

```python
# compute accuracy and cost on testing data
test_out = compute_out(weights, weights_last, x_test, y_test)
test_acc = jnp.sum(test_out > 0.5) / len(test_out)
test_acc_epochs.append(test_acc)
test_cost = 1.0 - jnp.sum(test_out) / len(test_out)
test_cost_epochs.append(test_cost)
```

That’s right above the last `NOTE` at the end of the tutorial. Let me know if this helps!

Thanks Isaac. You're right, I made a small mistake.

What is the reason behind using the three Ising gates (IsingXX, IsingYY, and IsingZZ) in the code? As far as I remember, Ising gates are used for studying the behavior of magnetic spins in a lattice; they can simulate the interaction between spins.

> What is the reason behind using the three Ising gates (IsingXX, IsingYY, and IsingZZ) in the code?

You are right that these interactions are typically used to model spin systems. In this application, you can think of them as two-qubit parameterized gates: nothing more complicated than an ansatz with some implicit "interactions" between neighbouring qubits (which makes a lot of sense for convolutional layers).
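For concreteness, IsingXX(θ) = exp(−i θ/2 · X⊗X) is just a one-parameter two-qubit unitary (IsingYY and IsingZZ are analogous with Y⊗Y and Z⊗Z). A quick NumPy construction of the matrix, purely for illustration and not the demo's PennyLane code:

```python
import numpy as np

# IsingXX(theta) = exp(-i * theta / 2 * X⊗X): a two-qubit gate whose single
# trainable parameter theta controls the strength of the XX "interaction".
X = np.array([[0, 1], [1, 0]], dtype=complex)
XX = np.kron(X, X)
I4 = np.eye(4, dtype=complex)

def ising_xx(theta):
    # XX squares to the identity, so the matrix exponential collapses to a
    # cosine/sine combination (same trick as for a single-qubit Pauli rotation).
    return np.cos(theta / 2) * I4 - 1j * np.sin(theta / 2) * XX

# Sanity checks: theta = 0 gives the identity, and any theta gives a unitary.
assert np.allclose(ising_xx(0.0), I4)
U = ising_xx(0.7)
assert np.allclose(U @ U.conj().T, I4)
```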

Thanks a lot again!
