Object discrimination using electrotactile feedback


Abstract

OBJECTIVE: A variety of bioengineering systems are being developed to restore tactile sensations in individuals who have lost somatosensory feedback because of spinal cord injury, stroke, or amputation. These systems typically detect tactile force with sensors placed on an insensate hand (or prosthetic hand in the case of amputees) and deliver touch information by electrically or mechanically stimulating sensate skin above the site of injury. Successful object manipulation, however, also requires proprioceptive feedback representing the configuration and movements of the hand and digits. APPROACH: Therefore, we developed a simple system that simultaneously provides information about tactile grip force and hand aperture using current amplitude-modulated electrotactile feedback. We evaluated the utility of this system by testing the ability of eight healthy human subjects to distinguish among 27 objects of varying sizes, weights, and compliances based entirely on electrotactile feedback. The feedback was modulated by grip-force and hand-aperture sensors placed on the hand of an experimenter (not visible to the subject) grasping and lifting the test objects. We also sought to determine the degree to which subjects could learn to use such feedback when tested over five consecutive sessions. MAIN RESULTS: The average percentage of correct identifications on day 1 (28.5% ± 8.2%) was well above chance (3.7%) and increased significantly with training, reaching 49.2% ± 10.6% on day 5. Furthermore, this training transferred reasonably well to a set of novel objects. SIGNIFICANCE: These results suggest that simple, non-invasive methods can provide useful multisensory feedback that might prove beneficial in improving control over prosthetic limbs.
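The core mapping described in the approach (sensor readings driving stimulation current amplitude) and the reported chance level can be illustrated with a minimal sketch. All parameter values below (sensor ranges, current window) are illustrative assumptions, not values taken from the study; only the 1-in-27 chance level comes from the abstract itself.

```python
def amplitude(reading, sensor_min, sensor_max, i_min, i_max):
    """Linearly map a sensor reading (e.g. grip force or hand aperture)
    to a stimulation current amplitude. Ranges here are hypothetical."""
    frac = (reading - sensor_min) / (sensor_max - sensor_min)
    frac = min(max(frac, 0.0), 1.0)  # clamp to the sensor's valid range
    return i_min + frac * (i_max - i_min)

# Chance level for identifying 1 of 27 objects:
chance = 1 / 27  # the 3.7% baseline reported in the abstract

# Example: mid-range grip force mapped into an assumed 0.5-3.0 mA window
print(round(amplitude(5.0, 0.0, 10.0, 0.5, 3.0), 2))  # 1.75
print(round(100 * chance, 1))  # 3.7
```

A linear amplitude map like this is only one plausible encoding; the study's actual transfer function between sensor output and stimulation current is not specified in the abstract.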
