The emergence of assistive robots presents the possibility of restoring vital degrees of independence in activities of daily living (ADL) for the elderly and the impaired. Although people can communicate their wishes in numerous ways, such as bodily expressions or actions and linguistic patterns, gaze-based implicit intention communication remains underdeveloped.

I am focusing on developing a new, tacit, nonverbal communication paradigm for Human-Computer Interaction (HCI) based on eye gaze. To achieve high performance and robustness, conventional gaze detection technologies use infrared illumination and high-resolution cameras. These systems, however, require complex setup and calibration, and are thus limited to laboratory studies and difficult to deploy in practice.

I am looking into the Tobii Eye Tracker 5. It was released recently and is still being explored in research; so far, I have not found any studies on using it for gaze-based implicit intention communication and human interaction.
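To make the idea of implicit intention communication more concrete, here is a minimal sketch of dwell-time detection, a common way to turn a raw gaze stream into an implicit selection signal: if the gaze stays within a small region for long enough, we infer interest. The function name, the `(t, x, y)` sample format, and the thresholds are my own assumptions for illustration, not part of any Tobii SDK; in practice the eye tracker would supply the sample stream.

```python
import math

def detect_dwell(samples, radius=0.05, min_duration=0.6):
    """Detect dwells in a gaze stream.

    samples: iterable of (t, x, y) tuples, with t in seconds and
             x, y as normalized screen coordinates in [0, 1].
    A dwell is a run of consecutive samples that stay within `radius`
    of the run's first sample for at least `min_duration` seconds.
    Returns a list of (start_t, end_t, centroid_x, centroid_y).
    """
    def flush(run, out):
        # Emit the run as a dwell if it lasted long enough.
        if run and run[-1][0] - run[0][0] >= min_duration:
            xs = [s[1] for s in run]
            ys = [s[2] for s in run]
            out.append((run[0][0], run[-1][0],
                        sum(xs) / len(xs), sum(ys) / len(ys)))

    dwells = []
    run = []  # current candidate run; run[0] is the anchor point
    for t, x, y in samples:
        if run and math.dist((x, y), run[0][1:]) <= radius:
            run.append((t, x, y))
        else:
            flush(run, dwells)
            run = [(t, x, y)]  # start a new candidate run here
    flush(run, dwells)
    return dwells
```

For example, one second of stable gaze at screen center followed by a quick glance elsewhere would yield a single dwell centered near (0.5, 0.5). A real system would map each dwell centroid onto an on-screen object or robot target to infer the user's intent.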
