Anyone have any ideas regarding transfer learning for NLP using quantum-classical hybrid models? https://link.medium.com/xFZPuI7Ry9 is a good starting place, but what about models with different input shapes, such as BERT?
Hi! — Thanks for sharing. My colleagues have looked at quantum-classical transfer learning, but more in the context of image processing. You can learn more about this in their paper:
Yes, I have tried out the accompanying notebook, even experimenting with some different models. I'm really interested in seeing how this can be applied to NLP.
I'm currently trying things out, starting with ELMo embeddings and later moving on to more advanced models. I'll keep updating this thread with any results I get.
@Nicolas_Quesada is there any way I can contact you privately?
Hi @quantumHS — You can reach us at email@example.com.
Hi @quantumHS, I'm the author of that article on Medium. I'll be happy to continue the conversation both online (here) and offline (firstname.lastname@example.org). IMO there must be a way to do e.g. named-entity recognition or POS tagging with a hybrid model, if that's what you were asking between the lines.
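For what it's worth, one common way to handle the input-shape mismatch raised above is to insert a trainable classical projection between the large pretrained embedding (e.g. BERT's 768-dim hidden state) and a small quantum layer, mirroring the classical-to-quantum transfer-learning recipe. Here's a minimal NumPy sketch of just that adapter step; all sizes and names (`EMBED_DIM`, `N_QUBITS`, `to_angles`) are illustrative assumptions, not code from the article:

```python
import numpy as np

EMBED_DIM = 768   # BERT-base hidden size (assumed input shape)
N_QUBITS = 4      # width of a hypothetical downstream quantum layer

rng = np.random.default_rng(0)
W = rng.normal(scale=0.02, size=(EMBED_DIM, N_QUBITS))  # trainable projection
b = np.zeros(N_QUBITS)                                  # trainable bias

def to_angles(embedding: np.ndarray) -> np.ndarray:
    """Project a 768-dim embedding down to N_QUBITS rotation angles.

    tanh keeps the outputs bounded in (-pi, pi), suitable for an
    angle-embedding circuit on the quantum side.
    """
    z = embedding @ W + b
    return np.pi * np.tanh(z)

angles = to_angles(rng.normal(size=EMBED_DIM))
print(angles.shape)  # (4,)
```

In the full hybrid model, `W` and `b` would be trained jointly with the quantum circuit's parameters; the same adapter idea applies to ELMo or any other embedding size by changing `EMBED_DIM`.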
Welcome and thanks for the link!