Printed sensing human-machine interface with individualized adaptive machine learning


Abstract

Developing intelligent robots with integrated sensing capabilities is critical for advanced manufacturing, medical robotics, and embodied intelligence. Existing robotic sensing technologies are limited to recording acceleration, driving torque, pressure feedback, and so on. Expanding and integrating multimodal sensors to mimic, or even surpass, human sensation remains substantially underdeveloped. Here, we introduce a printed soft human-machine interface consisting of an e-skin that enables gesture recognition with feedback stimulus and a soft robot with multimodal perception of contact pressure, temperature, thermal conductivity, and electrical conductivity. The sensing e-skin with adaptive machine learning was able to decode and classify hand gestures while accommodating re-wearing convenience and individual differences. The soft interface provides bidirectional, closed-loop communication between robots and human bodies. This work could substantially extend robotic intelligence and pave the way for more practical applications.
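To illustrate the kind of gesture decoding the abstract describes, the sketch below classifies multi-channel e-skin readings with a nearest-centroid classifier. This is a minimal illustrative example, not the paper's actual adaptive learning method; all channel names, gesture labels, and data are hypothetical.

```python
# Hypothetical sketch: decode hand gestures from multi-channel e-skin
# readings (e.g. pressure, temperature, conductivity) by comparing each
# new reading to per-gesture centroids learned from a wearer's own
# calibration samples -- a stand-in for the individualized adaptation
# described in the abstract.

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(samples):
    """samples: {gesture_label: [feature_vector, ...]} -> centroid model."""
    return {label: centroid(vecs) for label, vecs in samples.items()}

def classify(model, reading):
    """Return the gesture whose centroid is closest (squared Euclidean)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: sq_dist(model[label], reading))

# Synthetic 3-channel calibration data for one wearer (illustrative only).
samples = {
    "fist": [[0.9, 0.1, 0.2], [0.8, 0.2, 0.1]],
    "open": [[0.1, 0.8, 0.9], [0.2, 0.9, 0.8]],
}
model = train(samples)
print(classify(model, [0.85, 0.15, 0.15]))  # fist-like reading -> "fist"
```

Retraining the centroids on a few calibration gestures each time the e-skin is re-worn gives a crude form of the per-user adaptation the paper targets with machine learning.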
