Adding regularization to KerasLayer

Hello, I would like to add L2 regularization to the weights of a KerasLayer and was wondering if you could give me some tips on how to implement this. For context, I’m training a KerasLayer in TensorFlow that represents a quantum kernel of the form K(x, w), and I need to regularize a hinge loss for a classification problem with the L2 norm of w. I was hoping for something similar to the “kernel_regularizer” option of the Dense and Conv layers in TensorFlow, but it seems that this feature is currently unavailable for KerasLayers; it would be very useful to have.

Hi @Roberto, welcome to the forum!

You could try adding an L2 term on the weights to the optimized loss. E.g., let ws be the gate parameters and L the original loss; the new loss with weight regularization would be L_reg = L + reg_coef * np.mean(ws**2).
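As a minimal NumPy sketch of this idea (the function names, the ±1 label convention, and the example values are illustrative, not part of any PennyLane API):

```python
import numpy as np

def hinge_loss(y_true, y_pred):
    # Standard hinge loss; labels are assumed to be +1 or -1.
    return np.mean(np.maximum(0.0, 1.0 - y_true * y_pred))

def regularized_loss(y_true, y_pred, ws, reg_coef=0.1):
    # Original loss plus an L2 penalty on the gate parameters ws.
    return hinge_loss(y_true, y_pred) + reg_coef * np.mean(ws ** 2)

y_true = np.array([1.0, -1.0, 1.0])
y_pred = np.array([0.8, -0.5, 1.2])
ws = np.array([0.3, -0.4])

print(regularized_loss(y_true, y_pred, ws))  # ≈ 0.2458
```

In a training loop you would minimize `regularized_loss` instead of the bare hinge loss, with `reg_coef` controlling the strength of the penalty.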

Would this be what you were looking for?

Please let me know if this helps!

Hi @CatalinaAlbornoz,

Thank you for your prompt response.

Sure, I could do what you suggest, but I was hoping to take advantage of the high-level functionality of TensorFlow with model.compile and model.fit, where you don’t need to enter the loss or the regularization formulae explicitly.

At any rate, I did find a way of enabling regularization for the Keras Layer. Here is how:

  1. I copied your code for the class definition of KerasLayer to create a new class, KerasLayerRegularized.

  2. At the top of the code I added:
    from tensorflow.keras import regularizers

  3. At the end of the def __init__ method (after adding a kernel_regularizer argument to its signature), I added the following line of code:
    self.kernel_regularizer = regularizers.get(kernel_regularizer)

  4. In the def build method, I replaced:
    self.qnode_weights[weight] = self.add_weight(name=weight, shape=size, **spec)
    with
    self.qnode_weights[weight] = self.add_weight(name=weight, shape=size, regularizer=self.kernel_regularizer, **spec)

  5. Finally, in my own code, in the line where I create the Keras Layer I have:

     qlayer = KerasLayerRegularized(circuit, weight_shapes,
                                    weight_specs={"weights": {"initializer": "random_normal"}},
                                    kernel_regularizer=tf.keras.regularizers.L2(1e-1),
                                    output_dim=outdim)
    

It worked nicely. Perhaps other people will find it useful to enable the kernel_regularizer option for the KerasLayer as I did here, or, even better, this could be added as a feature in future versions.

Glad you got it working @Roberto!

If you would like to open an issue on our GitHub page, that will help inform the developers of any feature requests and also make the request visible to potential contributors :slightly_smiling_face:

Thank you for posting the solution here @Roberto! Would you like to open the issue on GitHub for requesting this feature? If not it’s ok, just let us know.

Thank you both. I submitted a feature request and provided the link to this conversation: https://github.com/PennyLaneAI/pennylane/issues/1777

Awesome, thanks @Roberto!