Abstract
This study examined how task complexity and self-regulated learning (SRL) support strategies relate to learners' cognitive load and emotional engagement in a Metaverse Learning Environment (MLE), using multimodal indicators. College students (N = 42) completed three game-embedded tasks of low, mid, and high complexity while receiving cognitive-regulated support (CS), motivational-regulated support (MS), or no additional support (control group, CG). Learners' cognitive and emotional engagement were assessed using self-reported mental effort together with multimodal process measures: an EEG-derived mental workload index (beta/[alpha + theta]), frontal alpha asymmetry (FAA), and facial expression analysis (FEA) metrics of task-facing orientation and expressive engagement. Task complexity showed robust but non-monotonic effects across modalities. The mid-complexity task elicited higher EEG workload and expressive engagement than the low- and high-complexity tasks, whereas task-facing orientation was lower in the high-complexity task than in the low- and mid-complexity tasks. FAA was also higher in the mid-complexity task than in the high-complexity task. Effects of SRL support strategies were modality-specific: the control group reported higher mental effort than the CS group, and the CS group showed higher expressive engagement than both the MS and CG groups during the mid-complexity task. These findings suggest that cognitive support may reduce subjective strain while sustaining emotional involvement under moderate challenge. More broadly, the results highlight the value of multimodal measurement for distinguishing perceived effort from physiological and behavioral indicators of learner experience in immersive learning environments.