A head-mounted wireless gaze tracker, in the form of gaze-tracking glasses, is used here for continuous, mobile monitoring of a subject's point of regard in the surrounding environment. We combine gaze tracking with hand gesture recognition to let a subject interact with objects in the environment by gazing at them and controlling them with hand gesture commands. The gaze-tracking glasses were built from low-cost hardware: a safety-glasses frame fitted with wireless eye-tracking and scene cameras. An open-source gaze estimation algorithm performs eye tracking and estimates the user's gaze. A visual marker recognition library identifies objects in the environment through the scene camera, and a hand gesture classification algorithm recognizes hand-based control commands. Combined, these elements form a system that lets a subject move freely in an environment, select the object they want to interact with by gaze (identification), and transmit a command to it by performing a hand gesture (control). The system identifies the target for interaction using visual markers. This interaction paradigm opens up new forms of interaction with objects in smart environments.
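The identification-then-control loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: all names are hypothetical, marker corners would come from a visual-marker library (such as ArUco) operating on scene-camera frames, and gaze points from the eye-tracking pipeline.

```python
# Hypothetical sketch of gaze-based selection plus gesture-based control.
# Marker corners and gaze points are given in scene-camera pixel coordinates.

def point_in_polygon(point, polygon):
    """Ray-casting test: is (x, y) inside the marker's corner polygon?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending to the right.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def select_target(gaze_point, markers):
    """Return the id of the marker the user is gazing at, or None."""
    for marker_id, corners in markers.items():
        if point_in_polygon(gaze_point, corners):
            return marker_id
    return None

def dispatch(target_id, gesture, command_map):
    """Map a recognized hand gesture to a command for the gazed-at object."""
    if target_id is None or gesture not in command_map:
        return None
    return (target_id, command_map[gesture])

# Usage: one marker-tagged object and one gesture-to-command mapping.
markers = {"lamp": [(100, 100), (160, 100), (160, 160), (100, 160)]}
command_map = {"open_palm": "toggle_power"}
target = select_target((130, 120), markers)   # gaze lands on the lamp marker
action = dispatch(target, "open_palm", command_map)
```

In a real system, the selection step would also apply a dwell-time threshold so that a brief glance across a marker does not trigger a selection.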
Journal of EMDR Practice and Research, 2013, Vol. 6, Issue 3