Abstract
Indoor activity monitoring methods support the wellbeing and safety of elderly and visually impaired individuals living in their homes. These methods use diverse technologies and sensors to monitor daily actions, namely movement, sleep patterns, and medication compliance, providing valuable insight into the user's daily life and overall health. The accuracy and adaptability of deep learning (DL) models make human activity recognition (HAR) a key tool for improving effectiveness, safety, and personalized insight in indoor spaces. Using DL techniques, HAR transforms indoor monitoring by enabling precise detection and interpretation of human actions. DL techniques automatically extract and learn discriminative features, making them well suited to identifying complex human activities in sensor data. Nevertheless, selecting an appropriate DL architecture and tuning its parameters is critical for superior results. This study presents a novel Indoor Activity Monitoring for Visually Impaired People with Hybrid Deep Learning for Enhanced Safety (IAMVIP-HDLES) methodology for IoT environments. The IAMVIP-HDLES methodology is designed to monitor and recognize the indoor activities of visually impaired people in real time. The IAMVIP-HDLES approach first applies Z-score normalization as a data pre-processing technique to standardize the input data, guaranteeing uniformity and improving system performance. For feature selection, the Osprey-Cauchy-sparrow search algorithm (OCSSA) is employed to identify the most relevant features in the raw data. In addition, a hybrid model comprising a convolutional neural network, bidirectional long short-term memory, and an attention mechanism (CNN-BiLSTM-Attention) is used to monitor indoor activities.
Finally, hyperparameter selection based on the improved whale optimization algorithm (IWOA) is performed to enhance the classification results of the CNN-BiLSTM-Attention model. Experiments are conducted on the UCI-HAR dataset, and a comparative analysis shows that the IAMVIP-HDLES method achieves a superior accuracy of 96.76% over existing techniques.
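To illustrate the Z-score pre-processing step mentioned above, the following is a minimal NumPy sketch, not the paper's implementation; the sample sensor array and the `eps` safeguard for constant-valued features are assumptions for illustration only:

```python
import numpy as np

def zscore_normalize(X: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Standardize each feature (column) to zero mean and unit variance.

    `eps` is an illustrative safeguard against division by zero
    when a feature is constant across all samples.
    """
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    return (X - mean) / (std + eps)

# Hypothetical sensor readings: 4 samples x 3 features
X = np.array([[1.0, 200.0, 0.5],
              [2.0, 220.0, 0.4],
              [3.0, 210.0, 0.6],
              [4.0, 230.0, 0.5]])
Xn = zscore_normalize(X)
```

After normalization, each feature column has approximately zero mean and unit standard deviation, so features measured on very different scales (e.g. acceleration vs. raw sensor counts) contribute comparably to downstream learning.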