Abstract
In our environments, we encode surrounding spatial information using egocentric (subject-to-object) and allocentric (object-to-object) reference frames. Spatial encoding, however, occurs in environments populated not only by objects but also by people, and this social information can significantly affect our spatial memory. Here, we investigated how implicit social information influences spatial encoding by designing a study with an explicit spatial task and implicit social cues. Participants memorized triads of geometric objects and provided egocentric and allocentric judgments of relative distance. Each object was positioned in front of a pair of social stimuli (virtual humans) or non-social stimuli (lamps and chairs, serving as control conditions). These stimuli, irrelevant to the spatial task, could stand at different proxemic distances (intimate, personal, and social) and, in the case of virtual humans and chairs, with mutual or non-mutual gaze (Facing/Not-Facing). A questionnaire assessing empathic disposition was also administered. The overall pattern of results showed that egocentric processing was facilitated relative to allocentric processing when the social cues were clear and allowed easy social categorization. Notably, no such advantage emerged when social categorization required further processing to understand the social relations. Empathic disposition was also associated with spatial performance. In conclusion, our findings demonstrate that social information defined by nonverbal signals (proxemic distance and gaze), even when irrelevant to a task, implicitly affects the way we represent our surrounding environment. This highlights the intertwined nature of spatial cognition and social processes in everyday life.