Abstract
BACKGROUND: A known disadvantage of the multilayer perceptron (MLP), a machine learning (ML) algorithm used across many fields, is the uncontrolled growth of its total number of parameters, which can make the model redundant in high dimensions, together with an ever-deepening stack of layers that ignores spatial information. Optimization algorithms have therefore been developed to determine the optimal number of MLP parameters. METHODS: In this paper, the performances of the Genetic Algorithm (GA), the Grasshopper Optimization Algorithm (GOA), and the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) are compared. The study also examined how variations in sample size affect these optimization algorithms. A dataset on the direct marketing campaigns of a Portuguese banking institution, obtained from the UCI Machine Learning Repository with a sample size of 4 521, was used. The Synthetic Minority Oversampling Technique (SMOTE) was applied to balance the binary dependent variable in the training data across the various sample sizes. RESULTS: Based on classification accuracy, specificity, sensitivity, precision, F-score, and execution time, the MLP based on CMA-ES (CMA-ES-MLP) was identified as the best classifier overall, as it maintained high values on these classification metrics and was the second fastest to train. The CMA-ES-MLP with a training sample of 5 114 was the ideal classifier, and it compares well with classifiers built in previous studies that used the same dataset. CONCLUSIONS: The study found no consistent increase or decrease in classification performance as the sample size grew; the metrics fluctuated markedly across sample sizes. Future studies are recommended to compare the best-performing classifiers identified in previous studies with the CMA-ES-MLP of this study under the same experimental conditions.
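
As a rough illustration of the evaluation metrics named in the abstract (accuracy, specificity, sensitivity, precision, F-score), a minimal sketch in Python follows; the function name, variable names, and counts are invented for illustration and are not taken from the paper:

```python
def classification_metrics(tp, fp, tn, fn):
    """Return the five metrics as a dict, given binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # recall / true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    precision = tp / (tp + fp)
    # F-score here is the harmonic mean of precision and sensitivity (F1)
    f_score = 2 * precision * sensitivity / (precision + sensitivity)
    return {
        "accuracy": accuracy,
        "sensitivity": sensitivity,
        "specificity": specificity,
        "precision": precision,
        "f_score": f_score,
    }

# Example with made-up counts:
m = classification_metrics(tp=80, fp=20, tn=90, fn=10)
print(m["accuracy"])  # 0.85
```

The study reports all five of these metrics (plus execution time) when ranking GA-, GOA-, and CMA-ES-tuned MLPs; computing them from a single confusion matrix, as above, is the standard formulation for a binary classifier.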