Abstract
Developing intelligent robots with integrated sensing capabilities is critical for advanced manufacturing, medical robotics, and embodied intelligence. Existing robotic sensing technologies are largely limited to recording acceleration, driving torque, pressure feedback, and similar signals. Expanding and integrating multimodal sensors to mimic, or even surpass, human perception remains substantially underdeveloped. Here, we introduce a printed soft human-machine interface consisting of an e-skin that enables gesture recognition with feedback stimulus and a soft robot with multimodal perception of contact pressure, temperature, thermal conductivity, and electrical conductivity. The sensing e-skin, aided by adaptive machine learning, decodes and classifies hand gestures while accommodating re-wearing and differences between individuals. The soft interface provides bidirectional, closed-loop communication between the robot and the human body. This work could substantially extend robotic intelligence and pave the way for more practical applications.