Explicit Features Versus Implicit Spatial Relations in Geomorphometry: A Comparative Analysis for DEM Error Correction in Complex Geomorphological Regions


Abstract

Global Digital Elevation Models (DEMs) exhibit systematic biases constrained by acquisition geometry and surface penetration. This study evaluates whether the increasing complexity of geometric deep learning (e.g., Graph Neural Networks, GNNs) is justified by performance gains over established feature engineering paradigms (e.g., XGBoost) under the constraints of sparse altimetry supervision. We established a rigorous comparative framework across four mainstream products (ALOS World 3D, Copernicus DEM, SRTM GL1, and TanDEM-X), using Sichuan Province, China, as a representative natural laboratory. Our results reveal a fundamental scale mismatch, in which the ~485 m average spacing of sampled altimetry footprints far exceeds the local terrain resolution: despite their topological complexity, hybrid GNN models fail to establish a statistically significant accuracy advantage over the systematically optimized XGBoost baseline, demonstrating RMSE parity. Mechanistically, we uncover a critical divergence in decision logic: XGBoost relies on a stable "physics skeleton" consistently dominated by deterministic features (terrain aspect and vegetation density), whereas GNNs exhibit severe "attribution stochasticity" (ρ ≈ 0.63-0.77). The GNN component acts as a residual-dependent latent feature learner rather than discovering universal topological laws. We conclude that for geospatial regression tasks relying on sparse supervision, "physics trumps geometry": a "feature-first" paradigm that prioritizes robust, domain-informed physical descriptors is preferable to the indeterminate complexity of "black box" architectures. This study underscores the need to prioritize explanatory stability over marginal accuracy gains in order to foster trusted Geo-AI.
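The "attribution stochasticity" metric quoted above (ρ ≈ 0.63-0.77) can be read as a rank correlation between the feature-importance vectors produced by independently seeded training runs: a low ρ means the model reshuffles which features it credits from run to run. A minimal sketch of that computation is shown below; the importance values and feature set are hypothetical, chosen only to illustrate the calculation, and the ranking/correlation helpers are plain-Python stand-ins for library routines such as `scipy.stats.spearmanr`.

```python
def ranks(values):
    # Average 1-based ranks in ascending order, with tie handling.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean rank of the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Spearman's rho = Pearson correlation of the rank vectors.
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-feature attributions from two training seeds
# (features: aspect, vegetation density, slope, curvature, roughness).
run_a = [0.31, 0.27, 0.18, 0.14, 0.10]
run_b = [0.22, 0.30, 0.12, 0.21, 0.15]

print(round(spearman(run_a, run_b), 2))  # → 0.6
```

A stable attribution pattern (the XGBoost "physics skeleton") would drive ρ toward 1 across seeds; values in the 0.6-0.8 band indicate that the ranking of credited features is only moderately reproducible.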
