
Reifying social movement trajectories

CHI 2013. Social movement trajectories are visualized as interactive path animations, allowing analysts to track group mobilization over time and geography. By mapping social media streams onto a dynamic spatial interface, the system highlights hotspots of activity and narrative shifts. Evaluation with activists shows improved understanding of movement evolution and identification of influential events.

A paper-digital interface for information capture and display in time-critical medical work

PervasiveHealth 2012. A paper-digital interface captures handwritten medical notes during emergencies and instantly displays patient data on tablets, improving coordination and reducing errors under pressure. The system integrates pen-based input with live digital visualization, ensuring critical information is always visible to care teams. Simulation evaluations demonstrate significant speed and accuracy gains compared to manual record-keeping.

A pen-based toolkit for authoring collaborative language activities

CSCW 2012. A pen-based toolkit enables educators to create collaborative language-learning exercises by combining paper worksheets with digital pens, automatically syncing handwritten responses to shared tablets. Instructors define prompts on paper; student answers are captured digitally for group review, fostering interactive peer feedback. Usability tests show increased engagement and a seamless transition between analog and digital environments.

Digital pen and paper practices in observational research

CHI 2012. A digital pen-and-paper system records researchers’ handwritten field notes and automatically tags them with audio/video captured during observations, enabling seamless review of multimodal data. The system links ink strokes to contextual media, improving recall and reducing manual transcription effort. Field studies show significant gains in data accuracy and efficiency compared to traditional note-taking methods.

Let's look at the cockpit: exploring mobile eye-tracking for observational research on the flight deck

ETRA 2012. A mobile eye-tracking system records pilot gaze during flight simulations to study cockpit workload and information-seeking behavior. Data analysis reveals patterns of visual attention linked to task complexity, informing the design of adaptive cockpit interfaces. Field tests show the platform collects high-fidelity gaze data without interfering with pilot performance.

Microanalysis of active reading behavior to inform design of interactive desktop workspaces

ITS 2012. A detailed study of reading activities uses video and gaze tracking to capture how users annotate, scroll, and reference documents on desktop displays. Analysis identifies patterns such as frequent context-switching and multitasking, guiding the design of interactive workspaces that integrate pen, touch, and keyboard input. Recommendations include adaptive layouts and gesture shortcuts to reduce cognitive load during active reading sessions.

Multimodal prediction of expertise and leadership in learning groups

ICMI 2012. A multimodal system analyzes speech, gesture, and facial cues from group interactions to predict individual expertise and leadership roles in collaborative learning. By training machine learning models on synchronized audio-visual data, the approach identifies patterns of influence and knowledge sharing. Results show the model predicts leadership emergence with over 80% accuracy, guiding interventions for effective team facilitation.

TAP & PLAY: an end-user toolkit for authoring interactive pen and paper language activities

CHI 2012. The TAP & PLAY toolkit allows educators to design interactive language exercises on paper, where learners tap printed prompts with a digital pen to receive instant audio feedback and gamified quizzes on tablet devices. The system bridges paper and digital learning by converting pen strokes into digital events, fostering active participation and immediate correction. Deployment in classrooms shows increased student motivation and measurable language gains.

Using overlays to support collaborative interaction with display walls

IUI 2012. Overlays allow multiple users to annotate large-scale display walls simultaneously by projecting personalized transparent layers, enabling private notes and group discussion without obscuring base content. Each participant’s input appears on a separate virtual layer, reducing interference and enhancing coordination. User feedback indicates overlays improve task parallelism and decrease communication overhead in collaborative teams.

PaperSketch: a paper-digital collaborative remote sketching tool

IUI 2011. PaperSketch bridges paper and digital whiteboards for remote teams by capturing sketches on paper and streaming them live to a shared canvas. Low-latency ink recognition ensures that collaborators see each other’s drawings in real time, creating a fluid, co-located feel despite physical distance. User studies indicate increased creativity and coordination compared to voice-only collaboration.