Possible to create a QNN like classical one?

Hi @SuFong_Chien,

We do not have any dedicated features yet for saving/loading models. Instead, users can leverage the tools from TensorFlow directly.

Perhaps the following guide can help you with loading saved/checkpointed models using TensorFlow.
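
In case it is useful in the meantime, here is a minimal sketch of how saving and restoring could look with `tf.train.Checkpoint`. The model below is just a placeholder; you would substitute your own QNN/Keras model and optimizer:

```python
import tensorflow as tf

# Placeholder model with trainable weights; substitute your own QNN/Keras model here.
model = tf.keras.Sequential([tf.keras.layers.Dense(4)])
model.build(input_shape=(None, 8))  # build so that variables exist before saving
optimizer = tf.keras.optimizers.Adam()

# Bundle the model and optimizer state into a checkpoint object.
ckpt = tf.train.Checkpoint(model=model, optimizer=optimizer)
manager = tf.train.CheckpointManager(ckpt, directory="./checkpoints", max_to_keep=3)

# Save during/after training.
save_path = manager.save()
print("Saved checkpoint to:", save_path)

# Later (e.g., in a new session), rebuild the same objects and restore the latest checkpoint.
ckpt.restore(manager.latest_checkpoint)
```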

Hi Tom

When I come back to this, it is still a classification problem. My output looks like [-0.358118 -0.0280287 -0.30103 -0.129338]. The values are not discrete like in the example above; they are continuous real values, so I cannot apply the same approach. I have also looked into the transfer learning example you mentioned, but again it is a classification example. Any suggestions? Thanks.

Hi @SuFong_Chien, we’re sorry for not getting back to you sooner! I missed this comment.

If I understand correctly, you would like to predict an output label for one of four classes? Given the vector [-0.358118 -0.0280287 -0.30103 -0.129338], this could be achieved by applying a softmax (e.g., using tf.nn.softmax) to obtain a probability vector. To find the prediction, you could then take tf.math.argmax of that vector, or alternatively sample from it.
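
For illustration, a minimal sketch of what that could look like, treating your four output values as unnormalized scores/logits (which is an assumption on my part):

```python
import tensorflow as tf

# The four raw output values, treated here as unnormalized scores (logits).
logits = tf.constant([-0.358118, -0.0280287, -0.30103, -0.129338])

# Convert the scores into a probability vector over the four classes.
probs = tf.nn.softmax(logits)

# Option 1: take the most likely class.
predicted_class = tf.math.argmax(probs)

# Option 2: sample a class from the distribution.
sampled_class = tf.random.categorical(tf.math.log(probs)[tf.newaxis, :], num_samples=1)
```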

However, one thing that is not clear to me is why the values you have are negative. If I recall correctly, these values should correspond to probabilities of detecting certain numbers of photons in each mode, so we would expect them to be non-negative and to sum to at most one. If that were the case, instead of applying a softmax, we could simply renormalize the vector.
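
As a sketch of that renormalization (using made-up non-negative values, since your actual outputs are negative):

```python
import tensorflow as tf

# Hypothetical non-negative photon-detection probabilities (values are illustrative only).
raw_probs = tf.constant([0.358118, 0.0280287, 0.30103, 0.129338])

# Renormalize so the entries sum to one, then pick the most likely class.
probs = raw_probs / tf.reduce_sum(raw_probs)
predicted_class = tf.math.argmax(probs)
```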

Hope this helps answer your question, and let us know if you have any more!

Thanks