Abstract
Understanding how self-confidence fluctuates during cognitive activity, and how these fluctuations relate to objective physiological signals, remains a challenge in psychological and computational research. Existing multimodal datasets predominantly target stress, affect, or workload and do not systematically capture experimentally manipulated self-confidence states. The CoSuBio dataset addresses this gap by providing a publicly available multimodal dataset collected under controlled confidence-induction conditions. It comprises synchronized EEG, peripheral physiological signals, behavioral task performance, and self-reported confidence measures from 34 participants, each of whom completed three within-subject phases: neutral, confidence-reducing, and confidence-enhancing. Continuous EEG (1 Hz), electrodermal activity, blood volume pulse, temperature, and accelerometer data were recorded while participants performed decision-making tasks, cognitive games, and puzzle-solving activities. Facial expression recordings are included for the two participants who provided explicit consent. CoSuBio is the first open dataset specifically designed to examine confidence-induced cognitive and behavioral changes using multimodal biosignals. It enables research on confidence estimation, performance prediction, and multimodal learning, and supports applications in adaptive learning systems, clinical psychology, affective computing, and human-computer interaction.