# Using SciPy functions for optimization

I played a little with the plugins and hybrid computation demo and tried to write a similar circuit, where the cost function is computed using a SciPy function. My circuit contains a QNode with three wires and two beamsplitters:

```python
import pennylane as qml

dev_fock = qml.device("strawberryfields.fock", wires=3, cutoff_dim=2)

@qml.qnode(dev_fock, diff_method="parameter-shift")
def photon_redirection2(params):
    qml.FockState(1, wires=0)
    qml.FockState(0, wires=1)
    qml.FockState(0, wires=2)
    qml.Beamsplitter(params[0], params[1], wires=[0, 1])
    qml.Beamsplitter(params[2], params[3], wires=[0, 2])
    return (
        qml.expval(qml.NumberOperator(1)),
        qml.expval(qml.NumberOperator(2)),
        qml.expval(qml.NumberOperator(0)),
    )
```


As you can see, I measure \hat{n}_1 and \hat{n}_2, and I decided the cost function should maximize the entropy of these two values:

```python
from scipy.stats import entropy

def cost1(params):
    return -entropy(photon_redirection2(params)[0:2])
```


Here I used scipy.stats.entropy. This attempt ended with an error:

```
TypeError: ufunc 'entr' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
```


I presume this is related to Autograd?

However, computing the entropy manually works just fine:

```python
def cost2(params):
    ret = photon_redirection2(params)[0:2]
    norm_ret = ret / np.sum(ret)
    return -np.sum([p * np.log(p) if p != 0 else 0 for p in norm_ret])
```
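For reference, the manual entropy computation can be sketched with plain NumPy alone (a minimal, self-contained version of the same formula; inside an actual QNode cost you would use PennyLane's wrapped `numpy` so gradients flow through it):

```python
import numpy as np

def manual_entropy(probs):
    """Shannon entropy of a non-negative vector.

    Normalizes the input to a probability distribution, then sums
    -p * log(p), treating the 0 * log(0) case as 0.
    """
    probs = np.asarray(probs, dtype=float)
    p = probs / np.sum(probs)
    return -np.sum([q * np.log(q) if q > 0 else 0.0 for q in p])

# A 50/50 split has entropy ln(2)
print(manual_entropy([0.5, 0.5]))
```

Because every operation here is either Python arithmetic or a NumPy call on scalars, nothing forces a dtype cast the way a compiled SciPy ufunc does.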

Hi @ofir.arzi,
Yes, this problem is related to Autograd.
PennyLane provides a wrapped version of NumPy so that users can benefit from automatic differentiation, whereas SciPy internally uses the standard NumPy, which does not have this feature.
When you use the cost2 function, no type casting is attempted or necessary.
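To illustrate the failure mode with a minimal stand-in (this is a sketch, not PennyLane's actual internals): during differentiation Autograd traces computations through box objects, which NumPy treats like an object-dtype array. A compiled SciPy ufunc such as `scipy.special.entr` (the function behind `scipy.stats.entropy`) only accepts numeric dtypes and raises the `TypeError` above, while element-wise Python arithmetic, as in `cost2`, has no such restriction:

```python
import numpy as np
from scipy.special import entr

# An object-dtype array stands in for Autograd's boxed values:
boxed = np.array([0.5, 0.5], dtype=object)

# The compiled ufunc rejects non-numeric dtypes:
try:
    entr(boxed)
except TypeError as err:
    print("entr failed:", err)

# Plain element-wise arithmetic (the cost2 approach) works fine:
manual = -sum(p * np.log(p) for p in boxed if p > 0)
print("manual entropy:", manual)
```

This is why rewriting the entropy in terms of PennyLane's wrapped NumPy operations, as in `cost2`, sidesteps the error entirely.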