I am not aware of eye-tracking work in this context, but there is a large number of papers on phonological processing using EEG and fMRI (or even simultaneous EEG-fMRI), for example:
Article Sensory processing of native and non-native phonotactic patt...
Article Acoustic-level and language-specific processing of native an...
By employing eye tracking to study phonological processes, researchers can gain valuable insights into the visual attention dynamics and cognitive mechanisms involved in perceiving and processing phonological information.
Eye tracking allows researchers to study auditory-visual integration, or the lack of it, in phonological processing. Studies are available in normal-hearing individuals as well as in the hearing impaired.
Thank you for sharing that valuable insight on the use of eye tracking to study auditory-visual integration in phonological processing. It's interesting that this technology enables researchers to investigate how individuals integrate auditory and visual information during language processing. I'm particularly intrigued by its application to both normal-hearing individuals and those with hearing impairments. Are there any specific findings or notable studies that have shed light on the role of auditory-visual integration in phonological processing? I would be interested to learn more about the outcomes of such research and how they contribute to our understanding of language processing in different populations.