Abstract
Live human faces, when engaged as visual stimuli, recruit unique and extensive patterns of neural activity. However, the neural mechanisms that underlie these live face-to-face processes are not known. We hypothesized that the neural correlates of live face processing are modulated by both spatial and temporal features of the live faces as well as by visual sensing parameters. Hemodynamic signals detected by functional near-infrared spectroscopy (fNIRS) were acquired concurrently with electroencephalographic (EEG) and eye-tracking signals during interactive gaze at a live human face or gaze at a human-like robot face. Regression of the fNIRS signals against two eye-gaze variables, fixation duration and dwell time, revealed separate regions of neural correlates: the right supramarginal gyrus (lateral visual stream) and the right inferior parietal sulcus (dorsal visual stream), respectively. These two areas served as the regions of interest for the EEG analysis. Standardized low-resolution brain electromagnetic tomography (sLORETA) was applied to determine theta (4-7 Hz) and alpha (8-13 Hz) oscillatory activity in these regions. Variations in oscillatory patterns corresponding to the neural correlates of the visual sensing parameters suggest an increase in spatial binding for the dorsal relative to the lateral regions of interest during live face-to-face visual stimulation.