Abstract
Accurate distance perception and collision reasoning are crucial for robotic manipulation in the confined interior of tokamak vacuum vessels. Traditional mesh- and voxel-based methods suffer from discretization artifacts, discontinuities, and heavy memory requirements, making them ill-suited to continuous geometric reasoning and optimization-based planning. This paper presents an Occupancy-Aware Neural Distance Perception (ONDP) framework that serves as a compact, differentiable geometric sensor for manipulator obstacle avoidance in reactor-like environments. To address the inadequacy of conventional sampling methods in such constrained environments, we introduce a Physically-Stratified Sampling strategy that moves beyond heuristic adaptation and explicitly shapes the training-data distribution according to specific engineering constraints. By allocating weighted sample quotas to critical safety buffers and enforcing symmetric boundary constraints, we ensure robust gradient learning in high-risk regions. A lightweight neural network is trained directly in physical units (millimeters) with a mean absolute error loss, ensuring strict adherence to engineering tolerances. The resulting model achieves approximately 2–3 mm near-surface accuracy and supports high-frequency distance and surface-normal queries for real-time perception, monitoring, and motion planning. Experiments on a tokamak vessel model demonstrate that ONDP provides continuous, sub-centimeter geometric fidelity. Benchmark results further show that the proposed method sustains query rates exceeding 15 kHz for large-scale batches, a 5911× speed-up over mesh-based queries. This performance enables seamless integration with trajectory optimization and model-predictive control frameworks for confined-space robotic manipulation.