Probabilities of a circuit don't add up to one

I’m running the following circuit on the Strawberry Fields platform, and the probabilities it generates don’t add up to one; I can’t figure out why.
The circuit:

import numpy as np
import strawberryfields as sf
from strawberryfields import ops

eng = sf.Engine("fock", backend_options={"cutoff_dim": 2})
prog = sf.Program(4)
array = [0.] * 6

with prog.context as q:
    # prepare one photon in each of the first two modes
    ops.Fock(1) | q[0]
    ops.Fock(1) | q[1]
    ops.Fock(0) | q[2]
    ops.Fock(0) | q[3]
    # apply the 4x4 unitary U (given below) as an interferometer
    ops.Interferometer(U) | (q[0], q[1], q[2], q[3])

result = eng.run(prog)
state = result.state
probs = np.array(state.all_fock_probs())

# probabilities of the six outcomes with at most one photon per mode
array[0] = probs[0, 0, 1, 1]
array[1] = probs[0, 1, 0, 1]
array[2] = probs[0, 1, 1, 0]
array[3] = probs[1, 0, 0, 1]
array[4] = probs[1, 0, 1, 0]
array[5] = probs[1, 1, 0, 0]

U is:

U = np.array([
    [ 0.40364712+8.15600153e-01j,  0.35110689+1.43525025e-01j,  0.15986655-4.72290949e-02j,  0.00723679-1.22329374e-02j],
    [ 0.10394578-3.95706514e-01j,  0.34043768+6.76229828e-01j,  0.45947770+2.10444020e-01j,  0.06021116-1.98798347e-02j],
    [-0.05471960+3.78126539e-02j,  0.12569266-5.05463349e-01j,  0.33299690+7.23569755e-01j,  0.27692649+1.14698367e-01j],
    [ 0.00723156+8.35107163e-04j, -0.08869668+6.14897215e-02j,  0.08223266-2.74960569e-01j,  0.41948884+8.54378436e-01j],
])

The array of probabilities it generates:
[1.126429679043999e-05, 0.0014664820957383993, 0.02522048151295811, 0.010012321153092966, 0.22563731287709546, 0.3744367053787543]

And the sum of all probs: 0.6367845673144297

Hi @segev, I think that your cutoff_dim here may be too low. I tried it with a different matrix U, and increasing the cutoff_dim to 3 or 4 can make a huge difference. If the problem remains, can you please post the output of sf.about()?
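
For what it's worth, here is a minimal sketch of that check, assuming U is the 4x4 unitary posted above and reusing the same circuit: rerun the program at a few cutoffs and compare the total probability captured by the truncated representation.

import numpy as np
import strawberryfields as sf
from strawberryfields import ops

# Sketch: rerun the same circuit at several cutoffs and compare how much
# total probability the truncated Fock representation captures.
for cutoff in (2, 3, 4):
    eng = sf.Engine("fock", backend_options={"cutoff_dim": cutoff})
    prog = sf.Program(4)
    with prog.context as q:
        ops.Fock(1) | q[0]
        ops.Fock(1) | q[1]
        ops.Fock(0) | q[2]
        ops.Fock(0) | q[3]
        ops.Interferometer(U) | (q[0], q[1], q[2], q[3])
    state = eng.run(prog).state
    total = np.array(state.all_fock_probs()).sum()
    print(f"cutoff_dim={cutoff}: total probability = {total}")

If U is passive, the total should reach 1 (up to numerical precision) once the cutoff exceeds the total photon number of 2.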

Hi @CatalinaAlbornoz, thanks for your answer!
The issue is that I’m interested only in outcomes with up to 1 photon in each mode, which is why I chose cutoff_dim = 2.
The relevant probabilities aren’t supposed to sum to 1 for a higher cutoff_dim anyway. Doesn’t the platform renormalize the Fock-state coefficients based on this cutoff, and therefore produce appropriate probabilities?

Hi @segev!

Having a low cutoff dimension means your state representation is incomplete and, as such, the probabilities will not sum to one. From that perspective, renormalizing the coefficients is not correct, because the outcome is not a true probability distribution: the probabilities you see are those of the outcomes that your truncated state representation can actually produce.

However, I see from your circuit that your initial state contains 2 photons, so a cutoff of 3 should be enough to represent your state if your interferometer is passive (i.e., it includes no squeezing). I was not able to check that directly, because copying and pasting your U yields a non-unitary matrix due to numerical errors.
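
As a quick side check (assuming U is the 4x4 array pasted above), one way to test whether the matrix is still unitary within numerical tolerance:

import numpy as np

# True only if U is unitary up to numerical tolerance
print(np.allclose(U @ U.conj().T, np.eye(U.shape[0])))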

Try

import numpy as np
from strawberryfields.decompositions import bloch_messiah

# build the symplectic (xxpp) representation of the unitary U from the question above
S = np.block([[U.real, -U.imag], [U.imag, U.real]])
O1, Sq, O2 = bloch_messiah(S)

# evaluates to True if S is a passive transformation
print(np.allclose(Sq, np.identity(Sq.shape[0]))) 

and if it evaluates to True, you know your transformation is passive. If not, you’ll have to increase your cutoff, and the squeezing values in Sq can be used as a proxy for finding an appropriate value.
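
As a rough sketch of that last step, assuming the usual xxpp ordering in which the diagonal of Sq pairs each e^{r_i} with e^{-r_i}:

import numpy as np

# Sketch: read off the squeezing parameters r_i from the diagonal of Sq.
# For a passive transformation all r_i are zero; larger values suggest
# that a larger cutoff_dim is needed.
d = np.diag(Sq)
r = np.abs(np.log(d[: len(d) // 2]))
print(r)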


Got it, thank you @Sebastian_Duque_Mesa! I’ll change my circuit accordingly.