Automation Expectation Mismatch: Incorrect Prediction Despite Eyes on Threat and Hands on Wheel


Abstract

OBJECTIVE: The aim of this study was to understand how to secure driver supervision engagement and conflict intervention performance while using highly reliable (but not perfect) automation.

BACKGROUND: Securing driver engagement — by mitigating the irony of automation (i.e., the better the automation, the less attention drivers will pay to traffic and the system, and the less capable they will be to resume control) and by communicating system limitations to avoid mental model misconceptions — is a major challenge in the human factors literature.

METHOD: One hundred six drivers participated in three test-track experiments in which we studied driver intervention response to conflicts after driving highly reliable but supervised automation. After 30 min, a conflict occurred wherein the lead vehicle cut out of lane to reveal a conflict object in the form of either a stationary car or a garbage bag.

RESULTS: Supervision reminders effectively maintained drivers' eyes on path and hands on wheel. However, neither these reminders nor explicit instructions on system limitations and supervision responsibilities prevented 28% (21/76) of drivers from crashing with their eyes on the conflict object (car or bag).

CONCLUSION: The results uncover the important role of expectation mismatches, showing that a key component of driver engagement is cognitive (understanding the need for action) rather than purely visual (looking at the threat) or motoric (having hands on wheel).

APPLICATION: Automation needs to be designed either so that it does not rely on the driver or so that the driver unmistakably understands that it is an assistance system that requires an active driver to lead and share control.
