Backpropagation for KerasLayer

Hi,
Merry Christmas!

  1. I would like to know whether the KerasLayer supports backpropagation when using TensorFlow tensors together with the qulacs device in version 13?
  2. Can we use parallel=True with TensorFlow to compute the gradients of multiple circuits?
  3. In TensorFlow 2.4.0 this error occurs:

    but in TF 2.3.1 it works fine.
    Thanks in advance!

Hi @kareem_essafty,

  1. Yes, you should be able to use backpropagation for a KerasLayer consisting of TensorFlow tensors for the classical part and the qulacs device for the quantum part (see the sketch after this list).
  2. It depends on what you mean here, more specifically. How does your question compare to this one for TorchLayer (which should have similar restrictions to KerasLayer)?
  3. This is hard to debug without further context (e.g., a minimal (non-)working example). The error seems to be thrown by Keras, and since you say it doesn't occur in earlier versions, my superficial guess is that it is a bug in TF/Keras, but it's hard to know for sure without the full code.
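
For reference, here is a minimal sketch of the kind of hybrid model meant in point 1, assuming the PennyLane-Qulacs plugin is installed (the `qulacs.simulator` device name comes from that plugin; the qubit count, circuit structure, and layer sizes are purely illustrative). TensorFlow backpropagates through the classical layers, while PennyLane supplies the gradient of the quantum layer (e.g., via the parameter-shift rule):

```python
import pennylane as qml
import tensorflow as tf

n_qubits = 2  # illustrative choice
dev = qml.device("qulacs.simulator", wires=n_qubits)  # requires pennylane-qulacs

@qml.qnode(dev, interface="tf")
def circuit(inputs, weights):
    # Encode the classical features, then apply trainable entangling layers
    qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.templates.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (3, n_qubits)}  # 3 entangling layers, illustrative
qlayer = qml.qnn.KerasLayer(circuit, weight_shapes, output_dim=n_qubits)

# Hybrid model: classical Dense layers around the quantum KerasLayer
model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(n_qubits, activation="tanh"),
    qlayer,
    tf.keras.layers.Dense(1),
])

# Gradients flow end-to-end: TF differentiates the classical parts, while
# PennyLane computes the quantum gradients behind the scenes
x = tf.random.uniform((4, n_qubits))
with tf.GradientTape() as tape:
    loss = tf.reduce_sum(model(x))
grads = tape.gradient(loss, model.trainable_variables)
```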