Abstract
Tactile displays are emerging as vital components across fields such as medical technology, assistive technology, virtual reality, infotech, and gaming, yet their design remains hampered by an inadequate psychophysical foundation. In this work, we critically assess current tactile display research, uncovering gaps in spatial acuity data, and introduce robust, data-driven layout guidelines for vibrotactile displays (VTDs). We collected high-resolution vibrotactile data from 33 participants across five large-area body sites using a novel, fully automated experimental framework that employs Bayesian adaptive parameter estimation to generate continuous psychometric functions. This approach allows us to derive thresholds at any recognition rate, thereby overcoming the limitations of traditional, discrete measures. Our findings show that existing datasets are scattered and inconsistent, demonstrate a pronounced horizontal anisotropy, especially near the body midline, and expose a marked sensitivity gradient along the lower back. These insights provide a validated psychophysical basis for VTD development, paving the way for more reliable, user-centric designs in next-generation tactile interfaces.