Tetrominos_learning.py

In initial data preparation, your code has

with engine:
    Dgate(disps_alpha) | q[0]
    Dgate(disps_beta) | q[1]

where disps_alpha and disps_beta are lists of length 7.
disps_alpha = tf.constant(
    [alpha, -alpha, alpha, -alpha, 1.0j * alpha, -1.0j * alpha, 1.0j * alpha]
)

My understanding is that Dgate takes at most 2 parameters. What am I missing?

Hi @sophchoe — The second parameter of the displacement gate is optional.
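For example, with the current Strawberry Fields API you can call the gate with either one argument or two (a minimal sketch, not code from the repo; the numeric values are arbitrary):

import strawberryfields as sf
from strawberryfields.ops import Dgate

prog = sf.Program(1)
with prog.context as q:
    Dgate(0.5) | q[0]        # displacement magnitude only; the phase defaults to 0
    Dgate(0.5, 0.3) | q[0]   # displacement magnitude and phase

eng = sf.Engine("fock", backend_options={"cutoff_dim": 5})
result = eng.run(prog)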

I understand. My question was about this code. https://github.com/XanaduAI/quantum-neural-networks/blob/master/tetrominos_learning/tetrominos_learning.py

The parameter the code is passing to Dgate is a list of length 7 instead of 1 or 2 values. As far as you know, has anyone successfully run this code?

disps_alpha = tf.constant(
    [alpha, -alpha, alpha, -alpha, 1.0j * alpha, -1.0j * alpha, 1.0j * alpha]
)

with engine:
    # State preparation
    Dgate(disps_alpha) | q[0]
    Dgate(disps_beta) | q[1]
    # Sequence of variational layers
    for i in range(depth):
        layer(i)

Thank you.

Hi @sophchoe!

Thanks for checking out the code. One thing to be aware of is that the linked repository is intended to support this paper; the code is not actively maintained and requires an older version of Strawberry Fields.

To answer your question, Dgate() can be supplied with either one or two parameters, corresponding to a displacement and a phase. In this code, we are supplying just one parameter (the displacement). The reason you see disps_alpha and disps_beta as length-7 tensors is that they are being fed into the engine as a batch over the 7 input tetromino images (LOTISJZ). You can see in the engine.run line that we specify this batch behaviour using batch_size=num_images (where num_images = 7).
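To make that concrete, here is a rough sketch of the batched pattern, assuming the legacy Strawberry Fields API (around v0.10) that the repository targets; alpha and cutoff are placeholder values, only q[0] is shown, and num_images follows the repo's naming:

import tensorflow as tf
import strawberryfields as sf
from strawberryfields.ops import Dgate

num_images = 7   # one displacement per tetromino class (LOTISJZ)
alpha = 1.0      # placeholder displacement amplitude
cutoff = 11      # placeholder Fock-space cutoff

# one complex displacement per image, stacked along the batch axis
disps_alpha = tf.constant(
    [alpha, -alpha, alpha, -alpha, 1.0j * alpha, -1.0j * alpha, 1.0j * alpha]
)

engine, q = sf.Engine(2)  # two modes, legacy-style engine construction

with engine:
    # a length-7 parameter tensor means 7 copies of the circuit evaluated in parallel
    Dgate(disps_alpha) | q[0]

# batch_size tells the TF backend to evaluate all 7 circuits at once
state = engine.run("tf", cutoff_dim=cutoff, batch_size=num_images)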


Got it! That clarifies it. Thank you, Tom!
