I have spatial and temporal data for my eye-tracking sessions: the x and y positions of the pupils, from which I can calculate the distance between two pupil positions in pixels. However, I don't know how to convert that distance into visual angles.
I assume you take the pixel coordinates w.r.t. your eye-tracking camera? In that case you should have done a calibration, which provides the calibration function to convert the xET_px/yET_px coordinates in the eye-tracker camera reference frame to x/y coordinates in degrees/pixels, taking the viewing distance into account. Normally, this calibration data is stored together with the raw pixel coordinates you are using.
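Once the gaze is expressed in screen pixels, converting a pixel distance to visual angle only requires the physical pixel size and the viewing distance. A minimal sketch of that geometry (function and parameter names are illustrative, not from any particular eye-tracker SDK):

```python
import math

def px_to_deg(d_px, px_pitch_mm, view_dist_mm):
    """Convert an on-screen distance in pixels to visual angle in degrees.

    d_px         -- distance between two gaze points, in pixels
    px_pitch_mm  -- physical size of one pixel in mm
                    (screen width in mm / horizontal resolution in px)
    view_dist_mm -- eye-to-screen viewing distance in mm
    """
    size_mm = d_px * px_pitch_mm
    # visual angle subtended by a segment of size_mm centered on the line of sight
    return math.degrees(2 * math.atan(size_mm / (2 * view_dist_mm)))

# Example: 100 px on a display with 0.27 mm pixel pitch, viewed from 600 mm
angle = px_to_deg(100, 0.27, 600)  # roughly 2.6 degrees
```

For small angles this is close to the linear approximation size/distance in radians, but the atan form stays correct for larger eccentricities as well.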
Thank you for your answer. I have done the calibration process. I used the eye center (x, y, z) and the gaze point (x, y, z) to calculate the gaze vector, although I'm not sure whether that is theoretically correct. (The coordinates are in the world-camera view.) However, my raw data already provides normalized gaze vectors (bottom left being (0,0) and top right being (1,1)). The normalized gaze vector I have is a 3D vector, and if I use it, I don't know how to convert it into pixels. I know the pixel resolution of the world camera, but what should I multiply the z-axis of my normalized gaze vector by?
What kind of eye tracker do you use? It might be that you have a rotation vector (which would explain the z-coordinate). However, if you have normalized coordinates, did a correct calibration, and added the viewing distance and display size to the calibration program, then these coordinates are normalized w.r.t. your calibration area (i.e., the stimulus display). In that case you can simply convert them by multiplying the normalized coordinates by your screen resolution (e.g., (1,1) would be equivalent to (1920 px, 1200 px)).
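Assuming the normalized coordinates really are screen-relative with a bottom-left origin (as described above), the conversion is a simple scaling, plus a y-axis flip if you want image-style pixel coordinates with the origin at the top left. A sketch with an assumed 1920x1200 display:

```python
def norm_to_px(x_norm, y_norm, res_x=1920, res_y=1200):
    """Map normalized gaze coordinates (0..1, origin bottom-left)
    to pixel coordinates with origin at the top-left of the screen.

    The y-flip is needed because normalized (0,0) is bottom-left,
    while pixel (0,0) is conventionally top-left.
    """
    x_px = x_norm * res_x
    y_px = (1.0 - y_norm) * res_y
    return x_px, y_px

# Normalized (1,1) = top-right corner -> pixel (1920, 0)
gaze_px = norm_to_px(1.0, 1.0)
```

Whether the flip applies depends on which pixel convention your analysis uses, so check one known corner point first.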
However, this statement is made without knowing your eye tracker and might be completely wrong depending on what your ET stores (here the manual might help).
From what I have tested so far with the Pupil Labs software, it's quite accurate. However, I used a metal rod to fix the eye camera in front of the eye, which can shift if the person wearing the tracker moves sharply, so a 3D-printed mount would be a better choice. In general, I combined the Pupil Labs DIY guide with two other papers to build the finalized eye tracker and used an ELP camera for the eye camera.
I am also working on a similar project. I have created a central stimulus at the center of the screen and then created saccadic target points at 10 degrees to the right, top, left, and bottom. How do I plot these values in MATLAB? I cannot simply write data points from -10 to +10 and plot them, because -10 may indicate either -10 to the left or -10 to the bottom. How can I plot them accordingly? I really need suggestions. David Jule Mack Nahal Norouzi
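One way to remove the ambiguity is to represent each target as a 2D point (x, y) in degrees of visual angle instead of a single signed amplitude: then "10 degrees left" is (-10, 0) and "10 degrees down" is (0, -10), which are distinct points. A Python sketch of the idea (target names and angles are illustrative; in MATLAB the same pairs can be drawn with `plot(x, y, 'o')`):

```python
import math

# Saccade amplitude in degrees, and the direction of each target
# measured counter-clockwise from the positive x-axis (rightward).
amplitude = 10
directions = {"right": 0, "up": 90, "left": 180, "down": 270}

# Each target becomes a 2D point (x_deg, y_deg) relative to the
# central fixation stimulus at (0, 0).
targets = {
    name: (round(amplitude * math.cos(math.radians(a)), 6),
           round(amplitude * math.sin(math.radians(a)), 6))
    for name, a in directions.items()
}
# e.g. targets["left"] is (-10.0, 0.0) and targets["down"] is (0.0, -10.0)
```

With this representation, plotting is just a scatter of the (x, y) pairs plus a marker at the origin for the central stimulus, and horizontal and vertical -10 can never be confused.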