Abstract
The rapid growth of flexible Internet of Things devices and multisensory human-computer interaction demands advanced systems capable of sensing, processing, and feeding back multimodal data in an intelligent and energy-efficient manner. However, conventional architectures suffer from energy inefficiency and interface mismatch because their sensing, processing, and feedback units are fragmented. Inspired by biological sensory systems, we integrate an MXene-based flexible dual-mode sensing-processing-visualizing system into a single wearable device through a hierarchical MXene platform that combines piezoelectric nanogenerators (mechanosensation), optoelectronic synapses (visual processing), and color-shifting quantum dot light-emitting diodes (optical feedback) with optimized interfaces. The wearable integrated system demonstrates tactile-visual signal recognition, biological self-protection behavior that adapts to environmental stimuli, dynamic trajectory recognition, and spatial positioning for motion recognition, capabilities that are highly desired in multisensory interaction. This bio-inspired material-architecture-function co-design strategy and integrated system promote the development of wearable neuromorphic hardware, edge computing, and intelligent human-machine interaction.