InteBOMB: Integrating generic object tracking and segmentation with pose estimation for animal behavior analysis


Abstract

Advancements in animal behavior quantification methods have driven the development of computational ethology, enabling fully automated behavior analysis. Existing multi-animal pose estimation workflows rely on tracking-by-detection frameworks for either bottom-up or top-down approaches, requiring retraining to accommodate diverse animal appearances. This study introduces InteBOMB, an integrated workflow that enhances top-down approaches by incorporating generic object tracking, eliminating the need for prior knowledge of target animals while maintaining broad generalizability. InteBOMB includes two key strategies for tracking and segmentation in laboratory environments and two techniques for pose estimation in natural settings. The "background enhancement" strategy optimizes foreground-background contrastive loss, generating more discriminative correlation maps. The "online proofreading" strategy stores human-in-the-loop long-term memory and dynamic short-term memory, enabling adaptive updates to object visual features. The "automated labeling suggestion" technique reuses the visual features saved during tracking to identify representative frames for training set labeling. Additionally, the "joint behavior analysis" technique integrates these features with multimodal data, expanding the latent space for behavior classification and clustering. To evaluate the framework, six datasets of mice and six datasets of non-human primates were compiled, covering laboratory and natural scenes. Benchmarking results demonstrated a 24% improvement in zero-shot generic tracking and a 21% enhancement in joint latent space performance across datasets, highlighting the effectiveness of this approach in robust, generalizable behavior analysis.
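The "background enhancement" strategy described above can be illustrated with a minimal sketch: a template feature is correlated against search-region features, and a foreground-background contrastive term rewards correlation maps in which foreground responses exceed background responses by a margin. All names (`correlation_map`, `fg_bg_contrastive_loss`) and the margin formulation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def correlation_map(template_feat, search_feats):
    """Cosine similarity between a template vector (C,) and each
    spatial feature in a search region (H, W, C) -> (H, W) map.
    Illustrative stand-in for a tracker's correlation head."""
    t = template_feat / np.linalg.norm(template_feat)
    s = search_feats / np.linalg.norm(search_feats, axis=-1, keepdims=True)
    return s @ t

def fg_bg_contrastive_loss(corr, fg_mask, margin=0.5):
    """Hinge-style contrastive term: push mean foreground correlation
    above mean background correlation by at least `margin`.
    A hypothetical form of a foreground-background contrastive loss."""
    fg = corr[fg_mask].mean()
    bg = corr[~fg_mask].mean()
    return max(0.0, margin - (fg - bg))

# Toy usage with random features and a central foreground mask
rng = np.random.default_rng(0)
search = rng.normal(size=(4, 4, 8))
template = rng.normal(size=8)
corr = correlation_map(template, search)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
loss = fg_bg_contrastive_loss(corr, mask)
```

Minimizing such a term sharpens the contrast between the target animal and its surroundings, which is one way a correlation map can be made more discriminative.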
