The Value of Individual Screen Response Time in Predicting Student Test Performance: Evidence from TIMSS 2019 Problem Solving and Inquiry Tasks


Abstract

The time students spend answering a test item (i.e., response time) and its relationship to performance can vary significantly from one item to another. Thus, using total or average response time across all items to predict overall test performance may lead to a loss of information, particularly with respect to within-person variability, which refers to fluctuations in a student's standardized response times across different items. This study aims to demonstrate the predictive and explanatory value of including within-person variability in predicting and explaining students' test scores. The data came from 13,829 fourth-grade students who completed the mathematics portion of Problem Solving and Inquiry (PSI) tasks in the 2019 Trends in International Mathematics and Science Study (TIMSS). In this assessment, students navigated through a sequence of interactive screens, each containing one or more related items, while response time was recorded at the screen level. This study used a profile analysis approach to show that students' standardized response times (used as a practical approximation of item-level timing) varied substantially across screens, indicating within-person variability. We further decomposed the predictive power of response time for overall test performance into a pattern effect (the predictive power of within-person variability in response time) and a level effect (the predictive power of the average response time). Results show that the pattern effect significantly outweighed the level effect, indicating that most of the predictive power of response time comes from within-person variability. Additionally, each screen response time had unique predictive power for performance, with the relationship varying in strength and direction. This finding suggests that fine-grained response time data can provide more information to infer students' response processes during the test.
Cross-validation and analyses across different achievement groups confirmed the consistency of results regarding the predictive and explanatory value of within-person variability. These findings offer implications for the design and administration of future educational assessments, highlighting the potential benefits of collecting and analyzing more fine-grained response time data as a predictor of test performance.
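The level/pattern decomposition described above can be illustrated with a minimal sketch. All data here are simulated and the variable names are illustrative, not the paper's actual TIMSS variables: screen-level response times are standardized within each screen, each student's mean standardized time is the "level," the per-screen deviations from that mean form the "pattern," and comparing the fit of a level-only regression against a level-plus-pattern regression shows how the pattern can carry additional predictive power.

```python
import numpy as np

rng = np.random.default_rng(0)
n_students, n_screens = 500, 6

# Simulated screen-level response times (arbitrary units); illustrative only.
rt = rng.normal(loc=4.0, scale=0.5, size=(n_students, n_screens))

# Standardize within each screen so times are comparable across screens.
z = (rt - rt.mean(axis=0)) / rt.std(axis=0)

# Level: each student's average standardized response time.
level = z.mean(axis=1, keepdims=True)
# Pattern: within-person deviations of each screen from that average.
pattern = z - level

# Simulated test score that depends mainly on the pattern (with screen-specific
# signs and strengths), mimicking the kind of result the abstract reports.
beta = np.array([0.6, -0.4, 0.3, -0.2, 0.5, -0.3])
score = pattern @ beta + 0.2 * level.ravel() + rng.normal(scale=0.5, size=n_students)

def r2(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1.0 - resid.var() / y.var()

# Pattern deviations sum to zero within each student, so one screen column is
# dropped to avoid perfect collinearity in the full model.
r2_level = r2(level, score)
r2_full = r2(np.column_stack([level, pattern[:, :-1]]), score)
print(f"R^2 level only: {r2_level:.3f}; level + pattern: {r2_full:.3f}")
```

In this simulation the level-only model explains little variance while adding the pattern columns raises the fit substantially, which is the sense in which the pattern effect can "outweigh" the level effect. The paper's actual analysis uses profile analysis rather than this plain OLS comparison; the sketch only conveys the intuition of the decomposition.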
