lightning.gpu with TensorFlow interface

Hi @Bnesh ,

I’m coming back to your question about the issues you’re seeing with lightning.gpu. In principle there shouldn’t be any problems, but it can depend on exactly what you’re running. Have you set max_diff=2 when instantiating your QNode? You can see an example in this post here, where second-order derivatives are calculated in two different ways and then compared.
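
If it helps, here’s a minimal sketch of what this can look like with the TensorFlow interface. The circuit, the parameter value, and the choice of `diff_method="parameter-shift"` are illustrative assumptions on my part, so adjust them to match your actual workload (note that `lightning.gpu` requires the PennyLane-Lightning-GPU plugin to be installed):

```python
import pennylane as qml
import tensorflow as tf

dev = qml.device("lightning.gpu", wires=1)

# max_diff=2 tells PennyLane to support second-order derivatives;
# parameter-shift is a differentiation method that allows higher orders.
@qml.qnode(dev, interface="tf", diff_method="parameter-shift", max_diff=2)
def circuit(x):
    # Illustrative single-qubit circuit: <Z> = cos(x)
    qml.RX(x, wires=0)
    return qml.expval(qml.PauliZ(0))

x = tf.Variable(0.5, dtype=tf.float64)

# Nested GradientTapes compute the second derivative on the TensorFlow side.
# The first derivative must be taken inside the outer tape so that
# TensorFlow records it and can differentiate it again.
with tf.GradientTape() as tape2:
    with tf.GradientTape() as tape1:
        y = circuit(x)
    grad = tape1.gradient(y, x)
hess = tape2.gradient(grad, x)

print(grad)  # first derivative: -sin(x)
print(hess)  # second derivative: -cos(x)
```

The two key pieces are setting max_diff=2 on the QNode and computing the first derivative inside the outer tape, so TensorFlow can differentiate it a second time.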

I hope this helps you!