Quantum Metric Learning Inquiries

Hi! I’m trying to implement quantum metric learning (https://pennylane.ai/qml/demos/tutorial_embeddings_metric_learning.html) with the MNIST dataset instead of the pictures of ants and bees. For now I’m using only the 0 and 1 images, to see whether metric learning can distinguish them clearly. I’m using 784 classical parameters and 12 quantum parameters, but I’m stuck: my classical parameters don’t seem to change after 100 iterations. (Also, just to make sure I understand: here we are training both the classical neural network (only its very last layer) and the quantum variational circuit, right?)

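To confirm the parameters really are frozen, here is roughly how I’m checking them (a minimal numpy sketch; `params_frozen`, `snapshots`, and the zero gradient are placeholders for my actual training loop). One gotcha this guards against: if you append the parameter array itself instead of a copy, every snapshot aliases the same array and the history looks flat even when training works.

```python
import numpy as np

def params_frozen(history, atol=1e-8):
    """Return True if the parameter vector stopped changing.

    `history` is a list of parameter snapshots, one per iteration.
    The names here are hypothetical -- adapt to your training loop.
    """
    first, last = np.asarray(history[0]), np.asarray(history[-1])
    return bool(np.allclose(first, last, atol=atol))

# Record a *copy* of the classical weights every iteration.
rng = np.random.default_rng(0)
weights = rng.normal(size=784)
snapshots = [weights.copy()]
for _ in range(100):
    grad = np.zeros_like(weights)   # stand-in for a vanishing gradient
    weights -= 0.01 * grad
    snapshots.append(weights.copy())

print(params_frozen(snapshots))  # a zero gradient leaves them frozen -> True
```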

Hey @Leeseok_Kim,

cool idea to try that. Yes, you are training the classical and “quantum” parameters together.

It’s very hard to say why training stops updating the parameters without digging in properly: maybe you have reached a (local?) minimum, or a region with very small gradients?
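One way to tell a genuinely flat region apart from a bug in the update step is to probe the gradient norm at the stuck point directly. A minimal sketch using a central finite difference (`grad_norm` and the toy cost are hypothetical; it assumes your loss is a plain callable over a parameter vector):

```python
import numpy as np

def grad_norm(cost, params, eps=1e-6):
    """Central finite-difference gradient norm of `cost` at `params`.

    A tiny norm suggests a flat region (vanishing gradients) rather
    than a bug in the optimizer or the parameter bookkeeping.
    """
    params = np.asarray(params, dtype=float)
    grad = np.zeros_like(params)
    for i in range(params.size):
        shift = np.zeros_like(params)
        shift[i] = eps
        grad[i] = (cost(params + shift) - cost(params - shift)) / (2 * eps)
    return float(np.linalg.norm(grad))

# Toy cost that is flat around the origin (hypothetical stand-in).
cost = lambda p: float(np.sum(p ** 4))
print(grad_norm(cost, np.zeros(12)) < 1e-10)  # flat at 0 -> True
```

With simulated circuits you could run the same check on the quantum parameters alone and the classical parameters alone, to see which part of the model the plateau lives in.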

These questions might help:

- Have you looked at the “intermediate representation” of the data, as done in the paper? Does it look like the classes are separated?
- What is your loss at the point of convergence? Is it still very high?
- What is your classification accuracy?
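For the separation question, a quick numeric proxy can complement eyeballing a scatter plot: compare the distance between the two class centroids with the within-class spread. A rough sketch (`separation_ratio` and the toy data are hypothetical; the real input would be your 2D intermediate representations of the 0s and 1s):

```python
import numpy as np

def separation_ratio(feats_a, feats_b):
    """Between-class centroid distance over mean within-class spread.

    A ratio well above 1 means the classes already look separated in
    the intermediate representation.
    """
    ca, cb = feats_a.mean(axis=0), feats_b.mean(axis=0)
    between = np.linalg.norm(ca - cb)
    spread = (np.linalg.norm(feats_a - ca, axis=1).mean()
              + np.linalg.norm(feats_b - cb, axis=1).mean()) / 2
    return float(between / spread)

# Two well-separated toy clusters standing in for the 0 and 1 classes.
rng = np.random.default_rng(1)
zeros = rng.normal(loc=(0.0, 0.0), scale=0.1, size=(50, 2))
ones = rng.normal(loc=(3.0, 3.0), scale=0.1, size=(50, 2))
print(separation_ratio(zeros, ones) > 1.0)  # clearly separated -> True
```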

I think this is part of the research fun: slowly uncovering whether it is the code, the model, or quantum physics in general that is stopping the training here…

Thanks, I’ll look into it. Also, I was wondering: why do you use 512 parameters in the last layer, instead of first reducing the layer size to far less than 512 with a classical neural network and then training far fewer classical parameters?

That was just the most straightforward option here… The demonstration aimed to show that it is possible, not so much to find the best solution. But to make runtimes shorter, this could be a good idea!
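If you try shrinking the input first, one simple option is a fixed PCA projection in front of the trainable layer, so only a handful of classical parameters remain to learn. A minimal numpy sketch (`pca_reduce` is a hypothetical helper; the demo itself does not do this):

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto their top-k principal components.

    A lightweight stand-in for shrinking the 784 MNIST pixels before
    the trainable classical layer.
    """
    Xc = X - X.mean(axis=0)
    # SVD of the centred data; rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(2)
images = rng.normal(size=(100, 784))   # stand-in for flattened MNIST
reduced = pca_reduce(images, 8)
print(reduced.shape)  # (100, 8)
```

Because the projection is fixed (fitted once on the training set), it adds no trainable parameters of its own; only the small layer after it is learned.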
