Semantic and fiducial-aided graph simultaneous localization and mapping (SF-GraphSLAM) for robotic in-space assembly and servicing of large truss structures


Abstract

This article proposes a method that uses knowledge of the modules and their desired assembly locations within a large truss structure to create a semantic and fiducial-aided graph simultaneous localization and mapping (SF-GraphSLAM) algorithm better tailored to robotic in-space assembly and servicing operations. This is achieved by first reducing the number of modules through a mixed assembly method rather than a strut-by-strut method. Each module is then correlated with a visual tag (in this article, an AprilTag), further reducing the number of observed elements from the number of sub-struts in that module to a single AprilTag marker; two tags are required to verify proper deployment of most deployable modules. Subsequently, we are able to use semantic information about the desired transformation matrix between any two adjacent module AprilTags within the desired assembly structure. For our experiments, we extended a factor graph smoothing and mapping model with this semantic information, observing the smaller set of landmark AprilTags, with a camera standing in for the robot for simplicity. The mathematical derivation of this new method is included in this article, along with simulations comparing it against the state of the art (SOA), which uses no structural knowledge. Overall, this research contributes to the SOA for general SLAM and, more specifically, to the underdeveloped field of SLAM for in-space assembly and servicing of large truss structures. It is critical to verify, as the robot assembles the modules, that each module is within the desired tolerances so that the final structure meets its design requirements. Being able to build a virtual twin of the truss structure as it is assembled is a key tentpole in achieving large space structures.
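The core idea of folding a known "desired assembly transform" between adjacent tags into the factor graph can be illustrated with a deliberately minimal sketch. The example below is not the paper's implementation: it uses a 1D, translation-only toy problem with hypothetical measurement values, where the camera is anchored at the origin, two tag positions are estimated from noisy camera observations, and one extra row of the linear system encodes the semantic constraint that the structure model places the tags exactly 1.0 m apart.

```python
import numpy as np

# Toy 1D pose-graph least squares (hypothetical values).
# State x = [tag0, tag1] positions along one axis; camera fixed at 0.
# Rows 1-2: camera observations of each tag (tag1 reading is noisy).
# Row 3: semantic between-factor from the structure model: tag1 - tag0 = 1.0 m.
A = np.array([
    [1.0, 0.0],   # camera -> tag0 observation
    [0.0, 1.0],   # camera -> tag1 observation
    [-1.0, 1.0],  # semantic factor on the relative tag transform
])
b = np.array([1.0, 2.1, 1.0])  # measured 1.0 m, 2.1 m; desired spacing 1.0 m

# Solve the over-determined system; the semantic row pulls the
# estimated spacing (raw: 1.1 m) toward the designed 1.0 m.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)
```

In a full implementation each row would be a 6-DOF factor on SE(3) with its own noise model, so the semantic factor's weight controls how strongly the CAD-specified transform constrains the estimate relative to the camera measurements.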
