Towards Explainable Multimodal Sensing for Swimming Analysis: Early Findings from the SWIM-360 Project


Abstract

Swimming performance analysis increasingly depends on multimodal sensing systems that capture physiological and biomechanical signals in real-world aquatic environments. While progress has been made in sensor fidelity and automated analysis, the interpretability of these systems remains limited, constraining their uptake in coaching practice. This paper presents early findings from the SWIM-360 project, which investigates how explainable artificial intelligence (XAI) can support transparent and actionable insights into swimming performance. We report preliminary results from EO SwimBETTER and TrainRed sensors, together with proof-of-concept outputs from video-based pose estimation. In parallel, we introduce mock-up visualisations and interaction concepts designed to elicit coach requirements for explainability; a qualitative questionnaire with eight professional swimming coaches was conducted for this purpose, and their responses informed the design of a multimodal, coach-centred explainability framework. Rather than providing a fully integrated model, the paper proposes a methodological framework that combines multimodal sensing with explainability-driven design principles, embedding explainability at the earliest design stage. Our findings highlight both the feasibility and the challenges of translating sensor data into interpretable knowledge for athletes and coaches, and show how XAI principles can guide the creation of trustworthy, coach-centred decision-support tools in aquatic sports.
