@josh I am working on converting the transfer learning demo to TensorFlow. Could you possibly provide me with a very basic example of how to implement a quantum circuit in between two classical layers in TensorFlow? Thanks!
Hi @James_Ellis,
This is a good case for using Keras in TensorFlow together with PennyLane's KerasLayer. The example in the docs defines a quantum circuit, creates a qml.qnn.KerasLayer from it, and places it before a classical layer.
Building on that, the example under Usage Details gives a good starting point for your specific case:
```python
import pennylane as qml
import tensorflow as tf
import sklearn.datasets

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnode(inputs, weights):
    qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.templates.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0)), qml.expval(qml.PauliZ(1))

weight_shapes = {"weights": (3, n_qubits, 3)}
qlayer = qml.qnn.KerasLayer(qnode, weight_shapes, output_dim=2)
clayer1 = tf.keras.layers.Dense(2)
clayer2 = tf.keras.layers.Dense(2, activation="softmax")
model = tf.keras.models.Sequential([clayer1, qlayer, clayer2])

data = sklearn.datasets.make_moons()
X = tf.constant(data[0])
Y = tf.one_hot(data[1], depth=2)

opt = tf.keras.optimizers.SGD(learning_rate=0.5)
model.compile(opt, loss='mae')
```
I'd also suggest checking out some other relevant threads on the forum, e.g., QNN KerasLayer with Model.
Hope this helps; let us know if you have further questions!
Thanks for the help!
Can backpropagation be used with a KerasLayer instead of finite differences?
Yep, with the latest version of PennyLane, the KerasLayer class supports backprop.