I am researching quantum kernel methods for classification tasks, focusing on mapping classical data into high-dimensional Hilbert spaces using parameterized quantum circuits. My goal is to design quantum feature maps that not only exploit embeddings that are hard to simulate classically but are also robust to the noise and decoherence inherent in current quantum hardware.
I am looking for theoretical frameworks, empirical studies, or benchmark experiments that address these challenges and offer practical implementation guidelines for real-world, high-dimensional datasets.
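For concreteness, here is a minimal sketch of the kind of feature map and fidelity kernel I have in mind, written in PennyLane with scikit-learn. The 4-qubit width, the Y-rotation angle encoding, and the linear CNOT entangling layer are illustrative assumptions on my part, not settled design choices:

```python
import pennylane as qml
import numpy as np
from sklearn.svm import SVC

n_qubits = 4  # illustrative width; would be matched to the number of input features
dev = qml.device("default.qubit", wires=n_qubits)

def feature_map(x, wires):
    """Angle-encode the features, then entangle with a linear CNOT chain."""
    qml.AngleEmbedding(x, wires=wires, rotation="Y")
    for i in range(len(wires) - 1):
        qml.CNOT(wires=[wires[i], wires[i + 1]])

@qml.qnode(dev)
def kernel_circuit(x1, x2):
    # Fidelity (overlap) kernel via the adjoint trick:
    # k(x1, x2) = |<0|U(x2)^dagger U(x1)|0>|^2
    wires = list(range(n_qubits))
    feature_map(x1, wires)
    qml.adjoint(feature_map)(x2, wires)
    return qml.probs(wires=wires)

def quantum_kernel(x1, x2):
    # Probability of measuring |0...0> equals the squared state overlap
    return kernel_circuit(x1, x2)[0]

# Toy usage: Gram matrix over a small random dataset, fed to a precomputed-kernel SVM
X = np.random.uniform(0, np.pi, size=(8, n_qubits))
y = np.array([0, 1] * 4)
K = np.array([[quantum_kernel(a, b) for b in X] for a in X])
clf = SVC(kernel="precomputed").fit(K, y)
```

On noisy hardware, the adjoint-trick evaluation above might be replaced by a SWAP test or combined with error mitigation, and the entangling structure itself may need to change; guidance on exactly these trade-offs is what I am hoping the literature can provide.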