A novel 3D indoor localization method integrating deep spatial feature augmentation and attention-based denoising


Abstract

The complexity of indoor environments and the high-dimensional, diverse nature of localization data pose significant challenges for three-dimensional (3D) indoor positioning systems. Existing methods often suffer from low positioning accuracy when training data is scarce, poor robustness to noise, and limited capability to capture global spatial features, which restricts their applicability in real-world scenarios. Additionally, collecting indoor positioning data requires substantial human effort, resulting in high data acquisition costs. Consequently, generating high-quality, high-density 3D positioning data from a limited number of real samples has become a critical issue. To address these limitations, this paper proposes a novel 3D indoor positioning method that integrates deep spatial feature enhancement and attention-based denoising. Specifically, a stacked variational autoencoder (SVAE) is used to extract structured deep spatial representations, while a Wasserstein generative adversarial network (WGAN) synthesizes realistic high-density samples to mitigate data sparsity. An attention mechanism is embedded in the encoder to improve global feature perception and spatial awareness, and controlled noise injection during training enhances robustness against noisy measurements. Experimental results show that, with only 10% of the UJIIndoorLoc dataset generated, the proposed method combined with a simple deep neural network (DNN) achieves 100% building localization accuracy and 94.7% floor localization accuracy, and reduces positioning error by 14.32%. Similar improvements are observed on the Tampere and UTSIndoorLoc datasets, with floor localization accuracies of 92.83% and 94.33% and positioning error reductions of 15.18% and 18.89%, respectively. These results demonstrate the effectiveness of the method in enhancing 3D indoor positioning with limited data.
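The abstract describes two training-time ingredients that can be illustrated compactly: controlled noise injection on fingerprint batches, and an attention step that lets each sample's representation incorporate global context from the rest of the batch. The sketch below is not the authors' implementation; it is a minimal NumPy illustration under assumed names (`attention`, `add_training_noise`) and toy RSSI-like data, omitting the SVAE and WGAN components entirely.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention(features):
    """Toy self-attention: pairwise similarity scores -> softmax
    weights -> globally reweighted feature representations."""
    scores = features @ features.T                 # (n, n) similarity
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # rows sum to 1
    return weights @ features                      # (n, d) mixed features

def add_training_noise(batch, sigma=0.05):
    """Controlled Gaussian noise injection, applied only during
    training, to improve robustness to noisy measurements."""
    return batch + rng.normal(0.0, sigma, size=batch.shape)

# Toy fingerprint batch: 4 samples x 6 access-point RSSI values (dBm)
batch = rng.uniform(-90.0, -30.0, size=(4, 6))
noisy = add_training_noise(batch)
mixed = attention(noisy)
print(mixed.shape)  # (4, 6)
```

In the paper's pipeline these operations would sit inside the SVAE encoder (learned attention weights rather than raw dot-product similarity), with the WGAN supplying additional synthetic fingerprints before the DNN regresses position.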
