Occupancy-Aware Neural Distance Perception for Manipulator Obstacle Avoidance in the Tokamak Vacuum Vessel


Abstract

Accurate distance perception and collision reasoning are crucial for robotic manipulation in the confined interior of tokamak vacuum vessels. Traditional mesh- or voxel-based methods suffer from discretization artifacts, discontinuities, and heavy memory requirements, making them unsuitable for continuous geometric reasoning and optimization-based planning. This paper presents an Occupancy-Aware Neural Distance Perception (ONDP) framework that serves as a compact and differentiable geometric sensor for manipulator obstacle avoidance in reactor-like environments. To address the inadequacy of conventional sampling methods in such constrained environments, we introduce a Physically-Stratified Sampling strategy. This approach moves beyond heuristic adaptation to explicitly dictate data distribution based on specific engineering constraints. By injecting weighted quotas into critical safety buffers and enforcing symmetric boundary constraints, we ensure robust gradient learning in high-risk regions. A lightweight neural network is trained directly in physical units (millimeters) using a mean absolute error loss, ensuring strict adherence to engineering tolerances. The resulting model achieves approximately 2-3 mm near-surface accuracy and supports high-frequency distance and normal queries for real-time perception, monitoring, and motion planning. Experiments on a tokamak vessel model demonstrate that ONDP provides continuous, sub-centimeter geometric fidelity. Crucially, benchmark results confirm that the proposed method achieves a query frequency exceeding 15 kHz for large-scale batches, representing a 5911× speed-up over mesh-based queries. This breakthrough performance enables its seamless integration with trajectory optimization and model-predictive control frameworks for confined-space robotic manipulation.
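The abstract describes a Physically-Stratified Sampling strategy that reserves weighted quotas for samples inside a near-surface safety buffer, with signed distances labeled in physical units (millimeters). The paper does not give the sampler's implementation; the sketch below is a minimal stand-in that illustrates the quota idea on a hypothetical analytic obstacle (a sphere substitutes for the vessel geometry, and the bounds, buffer width, and quota values are illustrative assumptions, not the paper's settings).

```python
import math
import random

def signed_distance_sphere(p, center=(0.0, 0.0, 0.0), radius=100.0):
    """Analytic signed distance in mm to a sphere obstacle (negative inside).
    A placeholder for the vessel's true distance field."""
    return math.dist(p, center) - radius

def stratified_samples(n, bounds=(-300.0, 300.0), buffer_mm=20.0,
                       buffer_quota=0.6, seed=0):
    """Draw n labeled 3-D points, filling a weighted quota of them from the
    near-surface safety buffer |sdf| <= buffer_mm (symmetric about the
    surface) via rejection sampling; the rest cover the workspace uniformly."""
    rng = random.Random(seed)
    lo, hi = bounds
    draw = lambda: tuple(rng.uniform(lo, hi) for _ in range(3))
    samples = []
    n_buffer = int(n * buffer_quota)
    # 1) Fill the safety-buffer quota first (high-risk region).
    while len(samples) < n_buffer:
        p = draw()
        d = signed_distance_sphere(p)
        if abs(d) <= buffer_mm:
            samples.append((p, d))
    # 2) Remaining samples are drawn uniformly over the whole workspace.
    while len(samples) < n:
        p = draw()
        samples.append((p, signed_distance_sphere(p)))
    return samples

data = stratified_samples(1000)
```

In this sketch at least 60% of the training points land within ±20 mm of the surface, concentrating supervision (and hence gradient accuracy) where collision decisions are made; a real pipeline would replace the analytic sphere with distance queries against the vessel mesh.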
