This poster abstract presents the first part of a study on using information about gaze, grip, and gesture to create non-command interaction. The experiment reported here seeks to establish whether patterns occur in nonverbal communication that could drive an activity-aware setup that adjusts to the individual's intentions and attention. Results indicate that basic patterns of facial direction and grip correlate with intention and/or attention; an analysis of gesture patterns is currently under way.
Venue: 12th ACM International Conference on Ubiquitous Computing (UbiComp), 2010