Abstract
Although eye cues have proven effective in simulated gaze-contact settings, it remains unclear how, and through which eye parameters, people interpret and use such cues in real interactions. We developed a real-time dyadic paradigm that restricted interaction to the eye region and incorporated asymmetrical roles and temporally structured interaction phases. One partner (the listener) heard emotion-inducing sounds, while the other (the observer), unaware of their timing or content, attempted to infer the listener's emotions solely from eye cues. Using a multi-measure approach, we analyzed fixation, blink, and pupil parameters in 25 dyads. Results showed that these parameters were shaped primarily by role- and phase-related processing demands rather than by emotional valence. Blinks indexed role-specific processing demands, adapting to attentional priorities. Interpersonal blink synchronization decreased when partners' attentional goals diverged, underscoring its dependence on attentional coupling. Fixations reflected shared attention allocation across roles, marked by active visual exploration during mutual gaze phases. Pupil dilation signaled phase-dependent arousal and cognitive effort, particularly for observers. Together, these findings reveal differential sensitivity across eye parameters, spanning attention allocation, social cognition, and interpersonal coordination, and highlight the need for multi-measure frameworks that model the eyes as an integrated system for real-time social communication.