A BIM-Guided Virtual-to-Real Framework for Component-Level Semantic Segmentation of Construction Site Point Clouds


Abstract

LiDAR point cloud semantic segmentation is pivotal for scan-to-BIM workflows; however, contemporary deep learning approaches remain constrained by their reliance on extensive annotated datasets, which are challenging to acquire in actual construction environments due to prohibitive labeling costs, structural occlusion, and sensor noise. This study proposes a BIM-guided Virtual-to-Real (V2R) framework that requires no real annotations. The method is trained entirely on a large synthetic point cloud (SPC) dataset consisting of 132 scans and approximately 8.75×10⁹ points, generated directly from BIM models with component-level labels. A multi-feature fusion network combines the global contextual modeling of PCT with the local geometric encoding of PointNet++, producing robust representations across scales. A learnable point cloud augmentation module and multi-level domain adaptation strategies are incorporated to mitigate differences in noise, density, occlusion, and structural variation between synthetic and real scans. Experiments on real construction floors from high-rise residential buildings, together with the BIM-Net benchmark, show that the proposed method achieves 70.89% overall accuracy, 53.14% mean IoU, 69.67% mean accuracy, 54.75% FWIoU, and 59.66% Cohen's κ, consistently outperforming baseline models. The Fusion model achieves 73 of 80 best scene-metric results and 31 of 70 best component-level scores, demonstrating stable performance across the evaluated scenes and floors. These results confirm the effectiveness of BIM-generated SPC and indicate the potential of the V2R framework for BIM-reality updates and automated site monitoring within similar building contexts.
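The abstract reports overall accuracy (OA), mean IoU, mean accuracy, frequency-weighted IoU, and Cohen's κ. The following is a minimal sketch of how these segmentation metrics are conventionally computed from a per-class confusion matrix; the function name and dictionary layout are illustrative, not taken from the paper.

```python
import numpy as np

def segmentation_metrics(conf):
    """Compute standard semantic segmentation metrics from a confusion
    matrix `conf`, where conf[i, j] counts points of true class i
    predicted as class j."""
    conf = np.asarray(conf, dtype=np.float64)
    total = conf.sum()
    tp = np.diag(conf)               # correctly classified points per class
    row = conf.sum(axis=1)           # points per true class
    col = conf.sum(axis=0)           # points per predicted class

    oa = tp.sum() / total                        # overall accuracy
    acc_per_class = tp / np.maximum(row, 1)      # per-class recall
    macc = acc_per_class.mean()                  # mean accuracy
    union = row + col - tp
    iou = tp / np.maximum(union, 1)
    miou = iou.mean()                            # mean IoU
    freq = row / total
    fwiou = (freq * iou).sum()                   # frequency-weighted IoU
    pe = (row * col).sum() / total ** 2          # chance agreement
    kappa = (oa - pe) / (1.0 - pe)               # Cohen's kappa

    return {"OA": oa, "mAcc": macc, "mIoU": miou,
            "FWIoU": fwiou, "kappa": kappa}
```

For a toy two-class matrix `[[8, 2], [1, 9]]`, this yields OA = 0.85 and κ = 0.7, illustrating how κ discounts the agreement expected by chance.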
