I did something similar. I pretrained a classical model (modelC) and saved it. I then re-loaded it in a different instance, froze its weights, and added a QNode plus a final classical decision layer, thus obtaining a new hybrid model (modelH). The process is:
- Load the model
modelC = load_model("model.h5")
- Freeze its weights so they are not trained further
modelC.trainable = False
- Add the QNode and a final decision layer, making a hybrid model
modelH = tf.keras.models.Sequential([modelC, qlayer, clayerD])
- Train the new hybrid model.
Sidenote: here the pretrained model contains 3 layers (clayer1, clayer2, clayer3), so instead of the modelC.trainable = False
command you can also freeze the weights layer by layer:
clayer1.trainable = False
clayer2.trainable = False
clayer3.trainable = False
It is very important to remember that the last classical layer before the QNode must have the same number of neurons as there are qubits in the QNode (this is why I call it the feeding layer), so step 3 can be something like:
modelH = tf.keras.models.Sequential([modelC, Feedinglayer, qlayer, clayerD])
and it makes sense for the feeding layer:
- not to pretrain it
- to use a custom trigonometric-range activation function (in my case I use a scaler to map my data into [0, π], so for the feeding layer's activation I use a custom sigmoid, tanh, etc. rescaled to that same range)
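For example, a sigmoid rescaled to the (0, π) range could look like this (a minimal sketch; the function name and layer sizes are illustrative, not from my actual code):

```python
import numpy as np
import tensorflow as tf

def pi_sigmoid(x):
    """Sigmoid squashed into (0, pi), matching data scaled to [0, pi]."""
    return np.pi * tf.math.sigmoid(x)

# Feeding layer: as many neurons as qubits in the QNode, custom
# activation, trained from scratch (it is not part of the frozen
# pretrained model)
n_qubits = 2
Feedinglayer = tf.keras.layers.Dense(n_qubits, activation=pi_sigmoid)

# Outputs land in (0, pi), ready for angle embedding in the QNode
out = Feedinglayer(tf.random.normal((4, 8)))
```

This way the feeding layer's outputs always live in the same range as the scaled input data, which keeps the angle embedding in the QNode well behaved.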