We’re very excited to announce the release of Strawberry Fields version 0.14.0.
This release is focused on updating the "tf" backend to support TensorFlow 2.0 and above. You can now use TensorFlow 2 features to build deep learning models in Strawberry Fields.
TensorFlow 2.0
Eager execution is now the default in TensorFlow, and is therefore also the default mode of the "tf" backend in Strawberry Fields; there is no longer any need to create a tf.Session(). Gradients can now be computed using tf.GradientTape().
See Migrate to TensorFlow 2 | TensorFlow Core for help with migrating your TensorFlow 1 code to TensorFlow 2.
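As a minimal illustration of the eager style (plain TensorFlow only, independent of Strawberry Fields), the following sketch computes a gradient with tf.GradientTape() and no session:

import tensorflow as tf

# Variables are tracked automatically in eager mode; no tf.Session() is needed.
x = tf.Variable(3.0)

with tf.GradientTape() as tape:
    # Operations execute immediately and are recorded by the tape.
    y = x ** 2

# dy/dx = 2 * x = 6.0
grad = tape.gradient(y, x)
print(grad.numpy())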
Below is an example of using TensorFlow 2.0 to train a variational photonic circuit:
Code example
import strawberryfields as sf
from strawberryfields.ops import Dgate
import tensorflow as tf

eng = sf.Engine(backend="tf", backend_options={"cutoff_dim": 7})
prog = sf.Program(1)

with prog.context as q:
    # Apply a single mode displacement with free parameters
    Dgate(prog.params("a"), prog.params("p")) | q[0]

opt = tf.keras.optimizers.Adam(learning_rate=0.1)

alpha = tf.Variable(0.1)
phi = tf.Variable(0.1)

for step in range(50):
    # reset the engine if it has already been executed
    if eng.run_progs:
        eng.reset()

    with tf.GradientTape() as tape:
        # execute the engine
        results = eng.run(prog, args={"a": alpha, "p": phi})
        # get the probability of fock state |1>
        prob = results.state.fock_prob([1])
        # negative sign to maximize prob
        loss = -prob

    gradients = tape.gradient(loss, [alpha, phi])
    opt.apply_gradients(zip(gradients, [alpha, phi]))

    print("Value at step {}: {}".format(step, prob))
For more details and demonstrations of the new TensorFlow 2.0-compatible backend, see our optimization and machine learning tutorials.
The full release notes are available at Release 0.14.0 · XanaduAI/strawberryfields · GitHub.
As always, this release would not have been possible without all the help from our contributors:
@Tom_Bromley, @theodor, @josh, @nathan, Filippo Miatto, @Nicolas_Quesada, @antalszava, Paul Tan