Custom Cost Function With KerasLayer

Hello, I have seen this tutorial from the PennyLane developers: https://pennylane.ai/qml/demos/tutorial_qnn_module_tf.html. There they say: “Note that there are more advanced combinations of optimizer and loss function, but here we are focusing on the basics.”

I am working on the autoencoder described in the “Continuous Variable Quantum Neural Networks” paper, therefore I need to define and use my own cost function. Is there a tutorial for this as well?

I have successfully created my hybrid quantum-classical network, but I am having trouble understanding how I can use my own cost function with a network built as described in the tutorial linked above.

All ideas and suggestions appreciated, thanks!

Another quick question: can I train the classical encoder part separately and use its outputs as inputs to the quantum decoder? This is the only idea I have, since I can’t train the network concatenated with KerasLayer.

Hi @brtymn,

You can create your own loss function by doing something like what is explained on this site. As an example, you can define a plain Python function and simply pass it as the loss in model.compile:

import tensorflow as tf

def custom_loss_function(y_true, y_pred):
    # Mean squared error, computed manually from targets and predictions
    squared_difference = tf.square(y_true - y_pred)
    return tf.reduce_mean(squared_difference, axis=-1)

model.compile(optimizer='adam', loss=custom_loss_function)
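
In the hybrid setting the custom loss plugs in the same way. Here is a minimal sketch, assuming a small toy circuit (the qnode, weight_shapes, and layer sizes below are placeholders, not your autoencoder), of compiling a KerasLayer-based model with the loss defined above:

import pennylane as qml
import tensorflow as tf

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnode(inputs, weights):
    # Encode the classical features, then apply a simple entangling ansatz
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (3, n_qubits)}
qlayer = qml.qnn.KerasLayer(qnode, weight_shapes, output_dim=n_qubits)

model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(n_qubits, activation="relu"),  # classical part
    qlayer,                                               # quantum part
    tf.keras.layers.Dense(1),
])

# The custom loss is passed exactly like a built-in Keras loss
model.compile(optimizer='adam', loss=custom_loss_function)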

Regarding your second question: in principle you can do that, but I don’t know whether it will work. If you do try it, let me know how it goes!
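
If you do try the two-stage idea, a rough sketch (building on the toy model above, with hypothetical data x_train and a hypothetical input dimension x_dim) might look like this: pretrain a purely classical autoencoder, then feed the trained encoder’s latent vectors into a separate quantum decoder model.

# Stage 1: pretrain a purely classical autoencoder (hypothetical sizes and data)
encoder = tf.keras.models.Sequential([
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(n_qubits),
])
autoencoder = tf.keras.models.Sequential([encoder, tf.keras.layers.Dense(x_dim)])
autoencoder.compile(optimizer='adam', loss=custom_loss_function)
autoencoder.fit(x_train, x_train, epochs=10)

# Stage 2: use the trained encoder's latent outputs as inputs to a quantum decoder
latents = encoder.predict(x_train)
quantum_decoder = tf.keras.models.Sequential([
    qml.qnn.KerasLayer(qnode, weight_shapes, output_dim=n_qubits),
    tf.keras.layers.Dense(x_dim),
])
quantum_decoder.compile(optimizer='adam', loss=custom_loss_function)
quantum_decoder.fit(latents, x_train, epochs=10)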