A mobile eye-tracking system records pilot gaze during flight simulations to study cockpit workload and information-seeking behavior. Data analysis reveals patterns of visual attention linked to task complexity, informing the design of adaptive cockpit interfaces. Field tests show that the platform collects high-fidelity gaze data without interfering with pilot performance.
As part of our research on multimodal analysis and visualization of activity dynamics, we are exploring the integration of data produced by a variety of sensor technologies within ChronoViz, a tool that supports the simultaneous visualization of multiple streams of time-series data. This paper reports on the integration of a mobile eye-tracking system with data streams collected from HD video cameras, microphones, digital pens, and simulation environments. We focus on the challenging environment of the commercial airline flight deck, analyzing the use of mobile eye-tracking systems in aviation human factors and reporting on techniques and methods that can be applied in this and other domains to collect, analyze, and visualize eye-tracking data in combination with the array of data types supported by ChronoViz.
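To make the cross-stream alignment problem concrete, the sketch below illustrates one way gaze samples and video frames might be placed on a common timeline: matching each eye-tracker sample to the nearest HD video frame by timestamp. This is a minimal illustration only; the function names, sample layout, and sampling rates are assumptions for the example and do not represent ChronoViz's actual API or the synchronization method described in this paper.

```python
# Minimal sketch (illustrative only, not ChronoViz's API): align mobile
# eye-tracker gaze samples to HD video frames by nearest timestamp so both
# streams can be inspected on a shared timeline.
from bisect import bisect_left

def nearest_frame(frame_times, t):
    """Return the index of the video frame whose timestamp is closest to t (seconds)."""
    i = bisect_left(frame_times, t)
    if i == 0:
        return 0
    if i == len(frame_times):
        return len(frame_times) - 1
    # Pick whichever neighboring frame is closer in time.
    return i - 1 if t - frame_times[i - 1] <= frame_times[i] - t else i

# Assumed example rates: 30 fps video and 60 Hz gaze samples on a shared clock.
frame_times = [k / 30.0 for k in range(300)]                # 10 s of video frames
gaze_samples = [(k / 60.0, 0.4, 0.6) for k in range(600)]   # (time, x_norm, y_norm)

# Each gaze sample is annotated with the index of its nearest video frame.
aligned = [(t, x, y, nearest_frame(frame_times, t)) for (t, x, y) in gaze_samples]
print(aligned[0], aligned[-1])
```

In practice, a shared or offset-corrected clock across recording devices is what makes this kind of nearest-timestamp matching meaningful; the sketch assumes the streams have already been brought onto a common time base.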