mixtur: An R package for designing, analysing, and modelling continuous report visual short-term memory studies


Abstract

Visual short-term memory (vSTM) is often measured via continuous-report tasks, whereby participants are presented with stimuli that vary along a continuous dimension (e.g., colour) with the goal of memorising the stimulus features. At test, participants are probed to recall the feature value of one of the memoranda in a continuous manner (e.g., by clicking on a colour wheel). The angular deviation between the participant's response and the true feature value provides an estimate of recall precision. Two prominent models of performance on such tasks are the two- and three-component mixture models (Bays et al., Journal of Vision, 9(10), Article 7, 2009; Zhang and Luck, Nature, 453(7192), 233-235, 2008). Both models decompose participant responses into probabilistic mixtures of: (1) responses to the true target value based on a noisy memory representation; and (2) random guessing when memory fails. In addition, the three-component model proposes (3) responses to a non-target feature value (i.e., binding errors). Here we report the development of mixtur, an open-source package written for the statistical programming language R that facilitates the fitting of the two- and three-component mixture models to continuous-report data. We also conduct simulations of parameter recovery and model recovery to develop recommendations for researchers on trial numbers, set sizes, and memoranda similarity. In the Discussion, we outline how mixtur can be used to fit the slots and slots-plus-averaging models, and how it can be extended to fit explanatory models of visual short-term memory. It is our hope that mixtur will lower the barrier to entry for utilising mixture modelling.
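The two-component decomposition described above can be written as a probability density over response errors e (in radians): f(e) = (1 − g)·VM(e; 0, κ) + g/(2π), where g is the guess rate and VM is a von Mises distribution with concentration κ. As a minimal illustrative sketch (in Python rather than mixtur's R implementation; the simulated parameter values and fitting routine are our own choices, not the package's), the model can be fit by maximum likelihood:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import vonmises

rng = np.random.default_rng(1)

# Simulate response errors (radians) from the two-component model:
# with probability 1 - g, a von Mises response centred on the target;
# with probability g, a uniform guess anywhere on the circle.
g_true, kappa_true, n_trials = 0.2, 8.0, 2000
guess = rng.random(n_trials) < g_true
errors = np.where(
    guess,
    rng.uniform(-np.pi, np.pi, n_trials),
    vonmises.rvs(kappa_true, size=n_trials, random_state=rng),
)

def neg_log_lik(params, e):
    """Negative log-likelihood of the two-component mixture."""
    g, kappa = params
    dens = (1 - g) * vonmises.pdf(e, kappa) + g / (2 * np.pi)
    return -np.sum(np.log(dens))

# Maximum-likelihood estimation with box constraints on g and kappa.
res = minimize(neg_log_lik, x0=[0.5, 2.0], args=(errors,),
               bounds=[(1e-4, 1 - 1e-4), (1e-2, 200.0)])
g_hat, kappa_hat = res.x
```

With a few thousand trials, the recovered guess rate and concentration should land close to the generating values, which is the logic behind the trial-number simulations the abstract mentions: fewer trials yield noisier parameter estimates.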
