Evaluating and optimising performance of multi-species call recognisers for ecoacoustic restoration monitoring


Abstract

Monitoring the effects of ecosystem restoration can be difficult and time-consuming. Autonomous sensors, such as acoustic recorders, can aid monitoring across long time scales. This project successfully developed, tested and implemented call recognisers for eight species of frog in the Murray-Darling Basin. Recognisers for all but one species performed well, and substantially better than many species recognisers reported in the literature. We achieved this through a comprehensive development phase, which carefully considered and refined the representativeness of the training data, as well as the construction (amplitude cut-off) and the similarity thresholds (score cut-offs) of each call template used. Recognisers for Crinia signifera, Limnodynastes fletcherii, Limnodynastes dumerilii, Litoria peronii and Crinia parinsignifera all performed well, with most templates having receiver operating characteristic values (the proportion of true positives and true negatives) over 0.7, and some much higher. Recognisers for L. peronii, L. fletcherii and L. dumerilii performed particularly well on the training data set, which allowed responses to environmental watering events, a restoration activity, to be clearly observed. While slightly more involved than building recognisers using commercial packages, the workflows ensure that a high-quality recogniser can be built and its performance fine-tuned using multiple parameters. Using the same framework, recognisers can be improved upon in future iterations. We believe that multi-species recognisers are a highly effective and precise way to detect the effects of ecosystem restoration.
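The evaluation approach described above can be illustrated with a minimal sketch. This is not the authors' code; the function name, example scores and cut-off values are hypothetical. It shows how a template's similarity-score cut-off might be swept, scoring each candidate against manually verified labels using the abstract's performance measure: the proportion of true positives and true negatives.

```python
# Hypothetical sketch: tuning a call-template's score cut-off against
# manually verified clips. A detection is declared when the recogniser's
# similarity score meets the cut-off; performance is the proportion of
# correct outcomes, (true positives + true negatives) / total clips.

def evaluate_template(scores, labels, score_cutoff):
    """scores: recogniser similarity score for each audio clip.
    labels: True if the clip genuinely contains the target call.
    Returns (TP + TN) / total."""
    correct = 0
    for score, present in zip(scores, labels):
        detected = score >= score_cutoff
        if detected == present:  # true positive or true negative
            correct += 1
    return correct / len(scores)

# Illustrative data: six verified clips and three candidate cut-offs.
scores = [0.92, 0.35, 0.80, 0.10, 0.55, 0.71]
labels = [True, False, True, False, True, False]
best = max((evaluate_template(scores, labels, c), c)
           for c in [0.3, 0.5, 0.7])
# best holds the highest proportion correct and its cut-off
```

In practice each species (and each template within a species) would be tuned independently, which is what allows the per-template fine-tuning the abstract describes.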
