Abstract
Visually guided decisions are often modeled as bounded accumulation of sensory evidence, with ``ramping'' activity in frontoparietal cortex taken as a canonical signature of integration. Here we show that anterior inferotemporal cortex (TEa) implements an alternative integration code during object-based decisions. In macaques performing stochastic face categorization, TEa neurons tracked momentary feature fluctuations while concurrently encoding an accumulated decision variable as a running average of task-relevant evidence. This running-mean code increased the category signal-to-noise ratio without ramping dynamics, supported flexible amplification of task-relevant features, and showed a sharp collapse in sensitivity to new evidence after commitment. These signatures emerged concurrently with, or slightly earlier than, those in simultaneously recorded LIP neurons, arguing against inheritance from feedback. A simple subunit model combining momentary evidence with a running-average decision variable accounted for TEa dynamics. Our results identify a previously unrecognized format for evidence integration and reposition TEa as a decision-aware hub within the ventral visual pathway.
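To make the distinction concrete, the two integration formats contrasted in the abstract can be sketched numerically. This is an illustrative toy, not the authors' subunit model: a running-mean decision variable stays bounded near the underlying evidence drift (improving signal-to-noise without growing amplitude), whereas a classic cumulative-sum accumulator produces the familiar ramping trajectory. All function names and parameters below are hypothetical.

```python
import random

def running_mean_dv(evidence):
    """Incremental running mean: m_t = m_{t-1} + (e_t - m_{t-1}) / t.

    The decision variable converges toward the mean evidence (the
    'drift') rather than growing with time, so the trace is flat on
    average while its variance shrinks as 1/t.
    """
    m, trace = 0.0, []
    for t, e in enumerate(evidence, start=1):
        m += (e - m) / t
        trace.append(m)
    return trace

def cumulative_sum_dv(evidence):
    """Classic bounded-accumulation-style cumulative sum.

    The expected value grows linearly with time, producing the
    'ramping' trajectory associated with frontoparietal integrators.
    """
    s, trace = 0.0, []
    for e in evidence:
        s += e
        trace.append(s)
    return trace

# Noisy momentary evidence with a small positive drift (toy values).
random.seed(0)
samples = [0.2 + random.gauss(0.0, 1.0) for _ in range(200)]

rm = running_mean_dv(samples)
cs = cumulative_sum_dv(samples)
# After n samples the running mean equals the cumulative sum divided
# by n: same information, different (non-ramping) neural format.
```

Both traces carry identical information about the evidence stream; the contrast is purely one of representational format, which is the point the abstract makes about TEa versus LIP.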