Abstract
BACKGROUND: Clinical reasoning is increasingly recognized as an important skill for diagnosing common and serious conditions. eCREST (electronic Clinical Reasoning Educational Simulation Tool) is a learning resource developed to support medical students in learning clinical reasoning. However, primary care teams now encompass a wider range of professional groups, such as physician assistants (PAs), who also need to develop clinical reasoning during their training. Understanding PA students' clinical reasoning processes is key to judging the transferability of learning resources originally designed for medical students.

OBJECTIVE: This exploratory study aimed to measure the clinical reasoning processes of PA students using eCREST and to compare them with data previously collected from medical students.

METHODS: Between 2017 and 2021, PA students and medical students used eCREST, in experimental or learning contexts, to develop their clinical reasoning skills. Students completed 2 simulated cases of patients presenting with lung symptoms. Within each case, they could ask the patient questions, select physical examinations, and order bedside tests to help them form, reflect on, and revise their diagnostic ideas and management plans. Exploratory analyses compared data gathering, diagnostic flexibility, and diagnostic ideas between medical and PA students.

RESULTS: In total, 159 medical students and 54 PA students completed the cases. PA students were older (mean 27, SD 7 y vs mean 24, SD 4 y; P<.001) and more likely to be female (43/54, 80% vs 84/159, 53%; P<.001). Medical and PA students were similar in the proportion of essential questions asked (Case 1: mean 70.1 vs mean 73.2; P=.33; Case 2: mean 74.6 vs mean 70.9; P=.27), physical examinations requested (Case 1: mean 54.7 vs mean 54.0; P=.59; Case 2: mean 69.3 vs mean 67.5; P=.59), bedside tests selected (Case 1: mean 74.4 vs mean 83.3; P=.05; Case 2: mean 47.9 vs mean 50.0; P=.69), and the number of times they changed their diagnoses (Case 1: mean 2.8 vs mean 2.8; P=.99; Case 2: mean 2.8 vs mean 2.5; P=.81). Both student groups improved in diagnostic accuracy over the course of the cases.

CONCLUSIONS: These results provide suggestive evidence that medical and PA students showed similar clinical reasoning styles when using an online training tool to support their diagnostic decision-making.