Would it be theoretically possible to take eye-tracking (gaze position) data and feed it into some kind of wearable device or virtual-reality setup that would make the whole body, or parts of it, feel as if it were moving in a direction counter to where the eyes are looking? If so, what kind of technology would be best suited to doing this?