Quantum-Efficient Word Embeddings

Master's Thesis

Language models represent words and sentences as high-dimensional vectors that capture their meaning, so-called "embeddings". This representation has proven successful across a wide range of natural language tasks. However, encoding these embeddings onto a quantum computer is challenging and resource-intensive, often requiring a large number of qubits and complex gate operations. This project aims to develop new methods for representing such information on quantum devices, focusing on strong performance on natural language tasks while reducing hardware costs such as the number of gates and qubits needed. By optimizing these representations, the goal is to make quantum computing more practical and efficient for language-related applications. This thesis requires a background in quantum computing or quantum information theory.
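The project description does not prescribe an encoding scheme, but amplitude encoding is one standard baseline that makes the resource trade-off concrete: a d-dimensional embedding fits into ⌈log2 d⌉ qubits, yet exact preparation of the resulting state generally requires a number of gates that grows exponentially with the qubit count. The NumPy sketch below is purely illustrative (the function name amplitude_encode and the GloVe-sized dimension of 300 are assumptions, not part of the project description); it only constructs the target statevector classically, without any quantum SDK.

```python
import numpy as np

def amplitude_encode(embedding: np.ndarray) -> tuple[np.ndarray, int]:
    """Map a classical embedding to a normalized quantum statevector.

    Amplitude encoding stores a d-dimensional vector in the amplitudes
    of ceil(log2 d) qubits. This sketch builds only the target state;
    preparing it exactly on hardware generally needs a gate count
    exponential in the number of qubits, which is the kind of cost
    this project seeks to reduce.
    """
    d = embedding.shape[0]
    n_qubits = int(np.ceil(np.log2(d)))
    # Pad to the next power of two so the vector fills the state space.
    padded = np.zeros(2 ** n_qubits)
    padded[:d] = embedding
    norm = np.linalg.norm(padded)
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    # Quantum states must have unit norm.
    return padded / norm, n_qubits

# Illustrative usage: a 300-dimensional embedding (a common GloVe size)
# fits in 9 qubits, but all 512 amplitudes must be prepared on hardware.
rng = np.random.default_rng(0)
state, n = amplitude_encode(rng.normal(size=300))
print(n, state.shape)  # 9 (512,)
```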