Abstract
The inferior parietal lobule supports action representations that are necessary to grasp and use objects in a functionally appropriate manner [S. H. Johnson-Frey, Trends Cogn. Sci. 8, 71-78 (2004)]. The supramarginal gyrus (SMG) is a structure within the inferior parietal lobule that specifically processes object-directed patterns of manipulation during functional object use. Here, we demonstrate that neural representations of complex object-directed actions in the SMG can be predicted by a linear encoding model that componentially builds complex actions from an empirically defined set of kinematic synergies. Each kinematic synergy represents a unique combination of finger, hand, wrist, and arm postures and movements. Control analyses demonstrate that models based on image-computable similarity (AlexNet, ResNet50, VGG16) robustly predict variance in visual areas, but not in the SMG. We also show that SMG activity is specifically modulated by kinematic (as opposed to visual) properties of object-directed actions. The action-relevant, rather than visual, nature of the representations supported by the SMG aligns with findings from neuropsychological studies of upper limb apraxia. These findings support a model in which kinematic synergies are the basic unit of representation, out of which the SMG componentially builds object-directed actions. In combination with other findings [Q. Chen et al., Cereb. Cortex 28, 2162-2174 (2018)], we suggest that kinematic synergies relate to complex object-directed actions much as articulatory and voicing features combine to form phonological segments in spoken language production.
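To make the abstract's central claim concrete, the following is a minimal sketch of the kind of linear encoding analysis described: voxel responses modeled as a weighted linear combination of kinematic synergy features, fit with ridge regression. This is not the authors' code; all variable names, dimensions, and the synthetic data are illustrative assumptions.

```python
# Hypothetical sketch of a linear encoding model: predict voxel responses
# from kinematic synergy weights. Synthetic data only; dimensions are
# illustrative, not taken from the study.
import numpy as np

rng = np.random.default_rng(0)

n_actions, n_synergies, n_voxels = 40, 8, 100

# Design matrix: each action expressed as weights on kinematic synergies
# (e.g., combinations of finger, hand, wrist, and arm postures/movements).
X = rng.normal(size=(n_actions, n_synergies))

# Simulated voxel responses generated from a linear combination of synergies,
# plus noise (stand-in for measured SMG activity).
true_W = rng.normal(size=(n_synergies, n_voxels))
Y = X @ true_W + 0.1 * rng.normal(size=(n_actions, n_voxels))

# Fit the encoding model with closed-form ridge regression.
lam = 1.0
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_synergies), X.T @ Y)

# Evaluate by correlating predicted and observed responses.
Y_hat = X @ W_hat
r = np.corrcoef(Y.ravel(), Y_hat.ravel())[0, 1]
print(round(r, 3))
```

In a real analysis the fit would be assessed on held-out actions (cross-validation), and prediction accuracy compared against control models such as the image-computable similarity models mentioned above.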