Vision-Based Grasping Method for Prosthetic Hands via Geometry and Symmetry Axis Recognition


Abstract

This paper proposes a grasping method for prosthetic hands based on object geometry and symmetry-axis recognition. The method uses computer vision to extract the geometric shape, spatial position, and symmetry axis of a target object, and selects an appropriate grasp mode and posture from these features. First, grasp patterns are classified based on an analysis of human hand-grasping movements, and a mapping is established between object geometry and grasp pattern. Target-object images are then captured with a binocular depth camera, the YOLO algorithm is used for object detection, and the SIFT algorithm is applied to extract the object's symmetry axis, from which the optimal grasp point and initial hand posture are determined. An experimental platform built on a seven-degree-of-freedom (7-DoF) robotic arm and a multi-mode prosthetic hand is used to conduct grasping experiments on objects with different characteristics. Experimental results demonstrate that the proposed method recognizes object geometric features with high accuracy and in real time. The system automatically matches the appropriate grasp mode to the object's features, improving grasp stability and success rate.
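To illustrate the symmetry-axis step, the sketch below estimates the dominant axis of a segmented object's pixel coordinates from second-order image moments. Note this is a simplified stand-in, not the paper's method: the authors use SIFT-based feature matching, whereas this moment-based orientation formula is a common lightweight alternative for roughly symmetric shapes; the function name and sample data are illustrative assumptions.

```python
import math

def principal_axis(points):
    """Estimate the dominant (symmetry) axis of a 2-D point set.

    Uses second-order central moments: orientation is given by the
    standard image-moment formula
        theta = 0.5 * atan2(2*mu11, mu20 - mu02).
    Returns the centroid and the axis angle in radians.
    """
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    mu20 = sum((p[0] - cx) ** 2 for p in points) / n
    mu02 = sum((p[1] - cy) ** 2 for p in points) / n
    mu11 = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
    theta = 0.5 * math.atan2(2 * mu11, mu20 - mu02)
    return (cx, cy), theta

# Pixels of a horizontal bar-shaped object: the estimated axis
# should be close to 0 radians (aligned with the x-axis).
pts = [(x, y) for x in range(20) for y in (0, 1)]
(centroid, angle) = principal_axis(pts)
```

In a full pipeline, the grasp point would be placed on this axis (e.g., at the centroid) and the hand's approach orientation aligned with the angle before the mode-specific preshape is commanded.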
