Lightning.kokkos still a valid plug-in?

Hi,

I just wanted to confirm whether lightning.kokkos is still an active plugin for PennyLane.

I see the blog post with installation instructions, but it is not listed on the plugin page.

Blog post: PennyLane goes Kokkos: A novel hardware-agnostic parallel backend for quantum simulations | PennyLane Blog

Official Plug-In List: Plugins and ecosystem | PennyLane

Looking forward to getting some clarity over its current status. Thank you! :slight_smile:

Hi @kamzam,

Yes! lightning.kokkos is still an active plugin. It's not yet listed on the PennyLane install page, but it is indeed active.

You can install it by running pip install pennylane-lightning-kokkos

Let me know if you have any additional questions!

Edit: If you just run pip install pennylane-lightning-kokkos it will work, but only on CPU (with the OpenMP backend). To have it work with GPU devices, for example, you would need to build from source so the compilation can configure everything for your local hardware. The full installation instructions are here in the documentation.

2 Likes

Hi @CatalinaAlbornoz,

Thank you for confirming, I will install it and reach out again if I face issues with it.

Hi @CatalinaAlbornoz ,

Doing a pip install using Anaconda is giving me the following error.

Could you kindly identify what might be causing it and suggest a solution for a successful installation?

Hey @kamzam , could you please include the things above where the screenshot cuts off? :sweat_smile: We're missing parts of the output that would be helpful here.

Hi @Ivana_at_Xanadu,

Please find attached the complete error output from the pip install command for Kokkos.

Environment information: basic Python 3.8 with pennylane-lightning installed before running the pip command for Kokkos, just in case that information helps as well.

kokkos_pip_install_error.txt (9.7 KB)

Hi @kamzam

Unfortunately we do not currently release prebuilt wheels for Lightning Kokkos on Windows. If you have access to WSL on Windows, you can install the pre-built Linux versions with OpenMP support directly using pip.

If you wish to manually build the project under Windows, I suggest reading the installation guide here. Since Windows natively provides a much older version of OpenMP than Linux, you may have trouble compiling with the default backend, in which case you can attempt to use the Kokkos threads backend with -DKokkos_ENABLE_THREADS=1. That said, the threads backend may not be as well supported under Windows, or by Kokkos, as the OpenMP backend.
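For illustration, a source build with that flag might look roughly like the following under WSL or a similar shell. This is only a sketch: the repository URL and the assumption that the build picks up CMake flags through the CMAKE_ARGS environment variable should be checked against the installation guide linked above.

```shell
# Sketch: build Lightning-Kokkos from source with the Kokkos threads
# backend instead of OpenMP (URL and flag mechanism assumed; see the
# installation guide for the authoritative steps).
git clone https://github.com/PennyLaneAI/pennylane-lightning-kokkos.git
cd pennylane-lightning-kokkos
CMAKE_ARGS="-DKokkos_ENABLE_THREADS=1" pip install .
```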

Otherwise, if the above fails, you may be best served by the lightning.qubit device, as this is guaranteed to work under Windows.

Hope this helps.

1 Like

Hello, what would be the gpu installation for Colab?

Hi @kevinkawchak ,

Here are the steps you can follow to use lightning.gpu on Google Colab:

  1. Open a new notebook
  2. In the navigation bar at the top go to Runtime → Change runtime type
  3. Choose a GPU
  4. In the first cell of the notebook run !pip install pennylane custatevec-cu11 pennylane-lightning-gpu
  5. Run a simple code (such as the code below) to test that everything works
import pennylane as qml

dev = qml.device("lightning.gpu", wires=2)

@qml.qnode(dev)
def circuit():
  qml.Hadamard(wires=0)
  qml.CNOT(wires=[0,1])
  return qml.expval(qml.PauliZ(0))

circuit()

Note that not all GPUs are compatible with lightning.gpu, so if you have questions about compatibility please let us know.

Also please note that the code above will probably run faster using lightning.qubit on a CPU than using lightning.gpu on a GPU. For computations under 20 qubits I recommend using lightning.qubit.

Finally, remember that Colab only gives you access to limited GPU resources, so use them sparingly to avoid hitting your usage limits.

I hope this helps!

Hello, is this the only code needed in Colab to access lightning.kokkos CPU/TPU/GPUs?

CPU/TPU

!pip install pennylane pennylane-lightning-kokkos

GPUs

!pip install pennylane custatevec-cu11 pennylane-lightning-kokkos

Hi @kevinkawchak, I don't know whether pennylane-lightning-kokkos works on Colab. But please let us know if you run into any issues.