At the lab we've been discussing how we could use eye-tracking methodologies for our research projects about 'mutual modeling'. This led me to a quick Web of Science/Google Scholar scan of what is available concerning the use of this technique to study the usage of collaborative interfaces. I then went on to look at whether it can be used in mobile settings. With regard to mobile context analysis, I ran across this intriguing project at igargoyle: Building a lightweight eyetracker by Jason S. Babcock & Jeff B. Pelz from the Rochester Institute of Technology:
(picture taken from the article)
Eyetracking systems that use video-based cameras to monitor the eye and scene can be made significantly smaller thanks to tiny micro-lens video cameras. Pupil detection algorithms are generally implemented in hardware, allowing for real-time eyetracking. However, it is likely that real-time eyetracking will soon be fully accomplished in software alone. This paper encourages an “open-source” approach to eyetracking by providing practical tips on building lightweight eyetrackers from commercially available micro-lens cameras and other parts. While the headgear described here can be used with any dark-pupil eyetracking controller, it also opens the door to open-source software solutions that could be developed by the eyetracking and image-processing communities. Such systems could be optimized without concern for real-time performance because the systems could be run offline.
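As a concrete illustration of what a software-only pupil detector could look like, here is a minimal dark-pupil sketch using OpenCV (my own toy example, not code from the paper; the threshold value is an invented starting point that would need tuning per camera and lighting):

```python
import cv2

def find_pupil_center(eye_frame, dark_threshold=40):
    """Locate the pupil in an eye-camera frame (dark-pupil method).

    dark_threshold is a made-up starting value; a real system would
    tune it per camera, per participant, and per lighting condition.
    """
    gray = cv2.cvtColor(eye_frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    # Keep only the darkest pixels: under IR illumination the pupil
    # appears as the darkest blob in the eye image.
    _, mask = cv2.threshold(blurred, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Assume the largest dark blob is the pupil (ignores eyelashes, shadows).
    pupil = max(contours, key=cv2.contourArea)
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x, y) centroid
```

Running something like this per frame, offline, is exactly the kind of processing the authors suggest could skip real-time constraints altogether.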
This seems interesting, but building three such lightweight devices (one per collaborator) would be really hard. Another cheap solution can be found in this paper: Building a Low-Cost Device to Track Eye Movement by Ritchie Argue, Matthew Boardman, Jonathan Doyle and Glenn Hickey:
We examine the feasibility of creating a low-cost device to track the eye position of a computer user. The device operates in real-time using prototype Jitter software at over 9 frames per second on an Apple PowerBook laptop. The response of the system is sufficient to show a low-resolution cursor on a computer screen corresponding to the user's eye position, and is accurate to within 1 degree of error. The hardware components of the system can be assembled from readily available consumer electronics and off-the-shelf parts for under $30 with an existing personal computer.
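The cursor-following behavior described above boils down to a calibration step: record pupil positions while the user fixates known on-screen targets, then fit a mapping from pupil coordinates to screen coordinates. The authors implemented theirs in Jitter; the sketch below is my own rough Python/NumPy equivalent of the idea, using a simple affine least-squares fit (the function names and the 3x3-grid suggestion are mine, and real trackers often use higher-order polynomial fits for better accuracy):

```python
import numpy as np

def fit_gaze_mapping(pupil_pts, screen_pts):
    """Fit an affine map from pupil (x, y) to screen (x, y) by least
    squares over calibration samples.

    pupil_pts, screen_pts: arrays of shape (n, 2), gathered while the
    user fixates known on-screen targets (e.g. a 3x3 grid of dots).
    """
    pupil_pts = np.asarray(pupil_pts, dtype=float)
    screen_pts = np.asarray(screen_pts, dtype=float)
    # Augment with a constant column so the fit includes an offset term.
    A = np.hstack([pupil_pts, np.ones((len(pupil_pts), 1))])
    coeffs, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
    return coeffs  # shape (3, 2)

def pupil_to_screen(pupil_xy, coeffs):
    """Map one pupil position to screen coordinates with the fitted map."""
    x, y = pupil_xy
    return np.array([x, y, 1.0]) @ coeffs
```

In use: collect one (pupil, target) pair per calibration dot, call fit_gaze_mapping once, then apply pupil_to_screen to every subsequent frame to drive the low-resolution cursor.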
Now, if we want to use this to study collaborative software, it's not easy, as attested by this paper: Using Eye-Tracking Techniques to Study Collaboration on Physical Tasks: Implications for Medical Research by Susan R. Fussell and Leslie D. Setlock. The paper discusses eye-tracking as a technique to study collaborative physical tasks, such as a surgical team collaborating to treat a patient. The authors point out its tremendous potential for this kind of research but also highlight some limitations:
The eye tracker typically can’t be calibrated correctly for a sizeable proportion of participants (up to 20%). Furthermore, the head-mounted device may slip over the course of a task, requiring recalibration to avoid data loss. This creates problems in collecting high-quality data. (...) Gaze data also requires considerable effort to process. (...) manual coding could quickly become unwieldy in a setting with many, many possible targets
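That last limitation is worth unpacking. In screen-based studies, coding can be automated by intersecting each gaze sample with predefined areas of interest (AOIs), roughly as in the toy sketch below (the AOI class, names, and coordinates are invented for illustration). In collaborative physical tasks the targets move and the head-mounted scene camera moves too, so there are no fixed AOIs to intersect against, and coding falls back to manual frame-by-frame work.

```python
from dataclasses import dataclass

@dataclass
class AOI:
    """An axis-aligned area of interest in screen coordinates."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def code_gaze_samples(samples, aois):
    """Label each gaze sample (t, x, y) with the first AOI containing it.

    A crude stand-in for manual coding: it only works when targets sit
    at fixed, known coordinates, which is precisely what breaks down in
    mobile and physical-task settings.
    """
    coded = []
    for t, x, y in samples:
        label = next((a.name for a in aois if a.contains(x, y)), "off-target")
        coded.append((t, label))
    return coded
```

With many possible targets the AOI list just grows, but the coding stays automatic; it is the lack of stable coordinates in physical tasks, not the number of targets per se, that makes the manual approach unwieldy.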