Abstract
Palpation and haptic feedback are vital for improving surgical precision, safety and decision-making in robotic-assisted surgery (RAS). Current RAS systems rely largely on visual input and lack both tactile feedback and autonomy, which increases training demands and limits scalability. The absence of touch impairs tissue characterization, complicates diagnostics and elevates the risk of unintended damage. Integrating haptics through force sensors, tactile interfaces and emerging audio-based tools has shown promise in restoring a sense of touch, particularly when enhanced by artificial intelligence and multimodal feedback systems. These technologies enable real-time data interpretation, improved tissue discrimination and reduced applied force, capabilities that are especially valuable in minimally invasive procedures. While progress has been made, challenges remain in sensor miniaturization, biocompatibility and system integration. Achieving semiautonomous and fully autonomous RAS will require intelligent sensing platforms combined with AI-driven analytics and feedback mechanisms that approach human tactile perception. As tactile sensing technologies evolve, future surgical robots will operate with greater autonomy, improved accuracy and broader global accessibility. The field is moving toward a new paradigm: surgical robots as intelligent, adaptive systems capable of performing or assisting procedures collaboratively or independently, using real-time sensing and control to optimize outcomes and reduce reliance on human expertise.