Abstract
This dataset contains synchronised electrocardiogram (ECG), electrodermal activity (EDA), skin temperature, and accelerometer recordings from 44 adult participants as they viewed 18 emotionally evocative video clips. The stimuli were selected from validated affective video repositories and represent diverse emotional categories, including neutral, positive, fear-related, social stress, sadness, humour, anger, shame, and interpersonal conflict. Physiological signals were acquired using BITalino (r)evolution boards at a sampling rate of 1000 Hz and stored as timestamped raw text files, with one file per stimulus to ensure consistent segmentation across participants. Each participant folder also includes structured demographic metadata in JSON format. Emotional categories correspond to stimulus-level intended elicitation targets and are not treated as participant-level ground truth. The fixed presentation sequence enables temporal comparability across subjects while preserving alignment of the multimodal recordings. The dataset supports research in affective computing, psychophysiology, stress analysis, and emotion recognition, and complements existing resources by providing synchronised raw physiological signals aligned with discrete, validated video stimuli. The dataset is openly accessible via Mendeley Data.
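For orientation, the sketch below shows one plausible way to read a participant folder as described above (one timestamped raw text file per stimulus, plus JSON demographic metadata). The file names, folder name, delimiter, and column order used here are illustrative assumptions, not part of the dataset specification; consult the documentation on Mendeley Data for the actual layout.

```python
# Minimal loading sketch. File and folder names, column order, and delimiter
# are hypothetical assumptions for illustration only.
import json
from pathlib import Path

import numpy as np

FS = 1000  # sampling rate in Hz, as stated in the abstract

participant_dir = Path("participant_01")  # hypothetical folder name

# Structured demographic metadata stored as JSON in each participant folder.
with open(participant_dir / "metadata.json") as f:  # hypothetical file name
    demographics = json.load(f)

# One timestamped raw text file per stimulus; assumed here to be numeric
# columns (timestamp followed by ECG, EDA, temperature, accelerometer).
signals = np.loadtxt(participant_dir / "stimulus_01.txt")  # hypothetical name

timestamps = signals[:, 0]       # assumed first column: timestamps
ecg = signals[:, 1]              # assumed channel order
duration_s = len(signals) / FS   # duration implied by the 1000 Hz rate

print(demographics)
print(f"Recording length: {duration_s:.1f} s")
```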