Evaluating the Accuracy of a Vision-Based Algorithm for Groundline Estimation in Trotting Horses Using Multiple Camera Angles



Abstract

BACKGROUND: Equine lameness diagnosis largely relies on subjective visual assessment, which can be biased. Although marker-based methods, force plates and inertial measurement units (IMUs) provide objective measurements, they require specialized setups. Vision-based algorithms offer a portable, markerless alternative, but their accuracy needs thorough testing. OBJECTIVES: To evaluate a custom vision-based algorithm for estimating the groundline across multiple camera angles, including handheld use, in horses trotting on a treadmill. STUDY DESIGN: Experimental comparative study. METHODS: Eight Standardbred trotter mares were recorded trotting on a high-speed treadmill using seven iPhones positioned at various heights and angles, including one handheld device. A trained deep neural network placed 2D keypoints on each video frame. Vertical displacement signals (VDS) for the eye, withers and croup (tuber sacrale) were computed relative to either an algorithm-estimated or a fixed treadmill groundline. Maximum (Maxdiff) and minimum (Mindiff) stride values were compared using Bland-Altman analysis, scatter plots and histograms. The effect of handheld use on variability and accuracy was assessed by comparing results from a handheld camera with those from a static camera. RESULTS: Groundline estimation closely matched the fixed reference, exhibiting near-zero mean angle error and low mean absolute error (MAE = 0.45°; n = 242,192). Stride-level Maxdiff and Mindiff MAE (n = 36,981) was 0.5 mm, with clinically acceptable additional variability introduced by handheld use at the trial level (Maxdiff and Mindiff MAE < 1.8 mm; n = 357). MAIN LIMITATIONS: Treadmill-based data and a single breed/coat colour may limit generalizability to other settings. CONCLUSIONS: The vision-based algorithm accurately estimates the groundline and stride VDS parameters from various camera setups, including handheld. Further validation in diverse environments and against other objective gait analysis systems is recommended.
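The Bland-Altman analysis used in METHODS to compare stride-level Maxdiff and Mindiff values between camera setups can be sketched as follows. This is a minimal illustration, not the study's actual code; the function name, the example measurements, and the camera labels are invented for demonstration.

```python
import statistics

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired measurements a and b.

    Returns the bias (mean of pairwise differences) and the 95% limits
    of agreement (bias ± 1.96 standard deviations of the differences).
    """
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical per-stride Maxdiff values (mm) from a static vs. a
# handheld camera; real data would come from the VDS of each stride.
static_cam = [10.2, 11.5, 9.8, 10.9, 11.1]
handheld_cam = [10.5, 11.3, 10.1, 11.0, 11.4]
bias, loa_lower, loa_upper = bland_altman(static_cam, handheld_cam)
```

A near-zero bias with narrow limits of agreement would indicate that the handheld camera introduces little systematic or random error relative to the static reference, which is the form of the trial-level comparison reported in RESULTS.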
