Abstract
Big Data Classification (BDC) has become increasingly important across domains such as healthcare, e-commerce, and banking. However, challenges such as high dimensionality and class imbalance often degrade the performance of conventional machine learning (ML) models. This study proposes a hybrid framework that integrates meta-heuristic optimization with class imbalance handling to enhance BDC effectiveness. To address the class imbalance problem in both binary and multi-class datasets, a Hybrid Synthetic Minority Over-sampling Technique (HSMOTE) is introduced. HSMOTE generates synthetic minority samples by interpolating between closely located minority instances, thereby improving the representation of rare classes. For robust feature selection, an Optimization Ensemble Feature Selection Model (OEFSM) is developed by combining the outputs of three algorithms: the Fuzzy Weight Dragonfly Algorithm (FWDFA), Adaptive Elephant Herding Optimization (AEHO), and Fuzzy Weight Grey Wolf Optimization (FWGWO). These algorithms contribute diverse search strategies that improve feature relevance and reduce redundancy. For classification, an Ensemble Deep Dynamic Classifier Model (EDDCM) is proposed. EDDCM incorporates three deep learning (DL) architectures: a Density Weighted Convolutional Neural Network (DWCNN), a Density Weighted Bi-Directional Long Short-Term Memory (DWBi-LSTM) network, and a Weighted Autoencoder (WAE). Their outputs are aggregated using a dynamic ensemble strategy that considers both accuracy and diversity to improve the reliability of the final prediction. All models are implemented in MATLAB R2014a, and performance is evaluated using precision, recall, F-measure, and accuracy. The proposed framework demonstrates improved classification results across various datasets, particularly under conditions of class imbalance and high dimensionality.
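The interpolation idea behind HSMOTE can be sketched in a few lines. The following is a generic SMOTE-style oversampling sketch, not the paper's exact HSMOTE algorithm (whose neighbour-selection and hybridization details are not given in the abstract); the function name `hsmote_interpolate` and all parameters are illustrative assumptions, and Python/NumPy is used here for brevity even though the study itself is implemented in MATLAB.

```python
import numpy as np

def hsmote_interpolate(minority, n_synthetic, k=3, seed=0):
    """Illustrative SMOTE-style oversampling sketch: create synthetic
    minority samples by interpolating between a minority instance and
    one of its k nearest minority neighbours."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_synthetic):
        i = rng.integers(len(minority))
        x = minority[i]
        # Euclidean distances from x to every other minority instance
        d = np.linalg.norm(minority - x, axis=1)
        d[i] = np.inf                         # exclude x itself
        neighbours = np.argsort(d)[:k]        # k closest minority instances
        j = rng.choice(neighbours)
        gap = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(x + gap * (minority[j] - x))
    return np.array(synthetic)

# Example: oversample a small 2-D minority class
X_min = np.array([[1.0, 1.0], [1.2, 0.9], [0.9, 1.1], [1.1, 1.2]])
X_new = hsmote_interpolate(X_min, n_synthetic=6)
print(X_new.shape)  # (6, 2)
```

Because each synthetic point lies on a segment between two real minority instances, the new samples stay inside the region the minority class already occupies, which is what makes this family of techniques preferable to naive duplication.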