Correction of UAV LiDAR-derived grassland canopy height based on scan angle


Abstract

Grassland canopy height is a crucial trait for indicating functional diversity and monitoring species diversity. Compared with traditional field sampling, light detection and ranging (LiDAR) provides a time-saving and cost-effective way to map regional grassland canopy height. However, canopy height estimated from unmanned aerial vehicle (UAV) LiDAR is usually underestimated, with height information lost due to the complex structure of grassland and the relatively small size of individual plants. We developed scan-angle-based canopy height correction methods that improve the accuracy of height estimation by compensating for this height loss. Our method established relationships between scan angle and two height loss indicators (height loss and height loss ratio) using ground-measured canopy heights of 1×1 m sample plots and LiDAR-derived heights. We found that the height loss ratio, which accounts for the plant's own height, performed better (R² = 0.71). We further compared the relationships between scan angle and height loss ratio over a holistic height range (25-65 cm) and over segmented ranges (25-40 cm, 40-50 cm, and 50-65 cm), and applied each to correct the estimated grassland canopy height. Our results showed that the accuracy of grassland height estimation based on UAV LiDAR was significantly improved, with R² increasing from 0.23 to 0.68 for holistic correction and from 0.23 to 0.82 for segmented correction. We highlight the importance of considering the effects of scan angle in LiDAR data preprocessing for estimating grassland canopy height with high accuracy, which also helps in monitoring height-related grassland structural and functional parameters by remote sensing.
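The correction workflow described in the abstract can be sketched in a few steps: compute a height loss ratio from paired ground-measured and LiDAR-derived plot heights, fit its relationship to scan angle, and invert that relationship to correct new LiDAR heights. The sketch below illustrates this idea with a simple linear model; all variable names and the sample values are hypothetical, and the paper's actual model form and coefficients may differ.

```python
import numpy as np

# Hypothetical plot data (illustrative only): ground-measured canopy
# height (cm), UAV LiDAR-derived height (cm), and mean scan angle (deg).
ground_h = np.array([30.0, 38.0, 45.0, 52.0, 60.0])
lidar_h = np.array([26.0, 31.0, 35.0, 39.0, 43.0])
scan_angle = np.array([5.0, 10.0, 15.0, 20.0, 25.0])

# Height loss ratio normalizes the loss by the plant's own height,
# the indicator the study found to relate better to scan angle.
loss_ratio = (ground_h - lidar_h) / ground_h

# Fit a simple linear model: loss_ratio ~ a * scan_angle + b.
a, b = np.polyfit(scan_angle, loss_ratio, 1)

def correct_height(h_lidar, angle):
    """Correct a LiDAR height by inverting the loss-ratio model:
    ground_h ~ lidar_h / (1 - predicted_loss_ratio)."""
    ratio = np.clip(a * angle + b, 0.0, 0.95)  # keep ratio physically plausible
    return h_lidar / (1.0 - ratio)

corrected = correct_height(lidar_h, scan_angle)
```

Segmented correction, as in the study, would simply repeat this fit separately for each height range (e.g., 25-40 cm, 40-50 cm, 50-65 cm) and apply the matching model to each plot.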
