A minimal recurrent neural network models the robustness of interleaved practice on motor sequence learning


Abstract

Effective motor skill learning depends critically on the structure of practice, not solely on practice volume. This study explores the differential learning outcomes when multiple procedural skills are acquired simultaneously through interleaved practice (IP) versus repetitive practice (RP) structures. In humans, IP yields superior long-term retention compared to RP, which often leads to significant forgetting. To explore the computational basis of how distinct practice structures impact motor sequence learning, we implemented a minimal recurrent network (the Elman network) to perform a sequential motor task. The network was trained on sequences presented in either RP or IP order. While RP led to faster error reduction during training, IP produced both superior trained-set performance and better generalization to novel sequences. These findings demonstrate that the benefits of IP emerge from the interaction between input variability and basic temporal recurrence alone, without requiring complex biological plasticity mechanisms or specialized contextual processes. Our results suggest a shared computational principle linking human motor learning with catastrophic forgetting in artificial neural networks: variability across practice contexts forces the formation of more robust and generalizable internal representations. This work offers a parsimonious account of IP benefits and informs both theoretical models of learning and strategies for optimizing motor rehabilitation.
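The abstract does not specify the task encoding or training procedure, but the core setup (an Elman network trained on motor sequences presented in blocked/RP versus interleaved/IP order) can be sketched in a few lines. The sequences, network sizes, and the crude output-only delta-rule update below are all illustrative assumptions, not the authors' implementation, which would presumably use full backpropagation through time:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical task: three 4-step "motor" sequences over 5 one-hot targets;
# the network predicts the next element of the current sequence.
SEQS = {"A": [0, 1, 2, 3], "B": [3, 2, 1, 0], "C": [0, 2, 4, 2]}

def one_hot(i, n=5):
    v = np.zeros(n)
    v[i] = 1.0
    return v

class Elman:
    """Minimal Elman network: the hidden state is fed back as context."""
    def __init__(self, n_in=5, n_hid=16, n_out=5):
        self.Wxh = rng.normal(0, 0.5, (n_hid, n_in))
        self.Whh = rng.normal(0, 0.5, (n_hid, n_hid))
        self.Why = rng.normal(0, 0.5, (n_out, n_hid))
        self.h = np.zeros(n_hid)

    def reset(self):
        self.h = np.zeros_like(self.h)

    def step(self, x):
        # Elman recurrence: new hidden state from input + previous hidden state
        self.h = np.tanh(self.Wxh @ x + self.Whh @ self.h)
        return self.Why @ self.h

    def train_seq(self, seq, lr=0.05):
        """One pass over a sequence with a delta-rule update on the output
        weights only -- a simplified stand-in for full BPTT."""
        self.reset()
        err = 0.0
        for t in range(len(seq) - 1):
            y = self.step(one_hot(seq[t]))
            e = one_hot(seq[t + 1]) - y
            self.Why += lr * np.outer(e, self.h)
            err += float(e @ e)
        return err

def schedule(mode, reps=30):
    names = list(SEQS)
    if mode == "RP":   # blocked: AAA... BBB... CCC...
        return [n for n in names for _ in range(reps)]
    return [n for _ in range(reps) for n in names]  # IP: ABC ABC ...

def run(mode):
    net = Elman()
    for name in schedule(mode):
        net.train_seq(SEQS[name])
    # test: total prediction error on the trained set, no learning (lr=0)
    return sum(net.train_seq(SEQS[n], lr=0.0) for n in SEQS)

print("RP test error:", run("RP"))
print("IP test error:", run("IP"))
```

The only difference between the two conditions is the order of the training list returned by `schedule`, which mirrors the paper's manipulation: identical practice volume, different practice structure. A toy run like this will not reliably reproduce the reported IP advantage; the point is only to make the experimental contrast concrete.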
