Vision Feedback Control for the Automation of the Pick-and-Place of a Capillary Force Gripper

Abstract

In this paper, we describe a newly developed vision feedback method for improving the placement accuracy and success rate of a single-nozzle capillary force gripper. The capillary force gripper was developed for the pick-and-place of mm-sized objects. The gripper picks up an object by contacting the top surface of the object with a droplet formed on its nozzle, and places the object by contacting the bottom surface of the object with a droplet previously applied to the placement surface. To improve the placement accuracy, we developed a vision feedback system combined with two cameras. First, a side camera was installed to capture images of the object and nozzle from the side. Second, from the captured images, the contour of the pre-applied droplet for placement and the contour of the object picked up by the nozzle were detected. Lastly, from the detected contours, the distance between the top surface of the droplet for object release and the bottom surface of the object was measured to determine the appropriate amount of nozzle descent. Through the experiments, we verified that the size-matching effect worked reasonably well: the average placement error is minimized when the cross-section of the object is close in size to that of the nozzle. We attributed this result to the self-alignment effect. We also confirmed that we could control the attitude of the object by matching the shape of the nozzle to that of the sample. These results support the feasibility of the developed vision feedback system, which uses the capillary force gripper for heterogeneous and complex-shaped micro-objects in flexible electronics, micro-electro-mechanical systems (MEMS), soft robotics, soft matter, and biomedical fields.
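The core measurement step of the pipeline above — finding the vertical gap between the bottom of the held object and the top of the pre-applied droplet in a side-view image — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a grayscale side-view image with a dark foreground on a bright background, a hypothetical intensity threshold `object_thresh`, and a row index `split_row` that roughly separates the nozzle/object region from the placement surface.

```python
import numpy as np

def measure_descent_px(side_image: np.ndarray, object_thresh: int, split_row: int) -> int:
    """Return the vertical gap, in pixels, between the bottom of the
    held object (above split_row) and the top of the pre-applied
    droplet (below split_row) in a grayscale side-view image.

    Assumptions (illustrative, not from the paper): dark foreground on
    a bright background; split_row separates the two regions.
    """
    # Binarize: foreground pixels are those darker than the threshold.
    binary = side_image < object_thresh

    # Rows containing any foreground pixel, split into the two regions.
    rows_obj = np.where(binary[:split_row].any(axis=1))[0]
    rows_drop = np.where(binary[split_row:].any(axis=1))[0]
    if rows_obj.size == 0 or rows_drop.size == 0:
        raise ValueError("object or droplet not detected in side view")

    object_bottom = rows_obj.max()             # lowest foreground row of the object
    droplet_top = split_row + rows_drop.min()  # highest foreground row of the droplet

    return droplet_top - object_bottom

# Synthetic example: a 100x50 bright image with a dark "object" band
# (rows 10-20) and a dark "droplet" band (rows 70-80).
img = np.full((100, 50), 255, dtype=np.uint8)
img[10:21, 15:35] = 0   # held object
img[70:81, 15:35] = 0   # pre-applied droplet
gap = measure_descent_px(img, object_thresh=128, split_row=50)
print(gap)  # vertical gap in pixels between object bottom and droplet top
```

The pixel gap would then be converted to a physical descent distance via the camera calibration (pixels per mm), which the sketch leaves out.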
