Abstract
Background: Although machine learning has advanced in medicine, its widespread clinical adoption, particularly for predicting breast cancer metastasis, remains limited. We have been developing a deep feed-forward neural network (DFNN) model to predict breast cancer metastasis n years in advance. A key challenge is efficiently identifying optimal hyperparameter values through grid search under time and resource constraints: continuous hyperparameters such as L1 and L2 admit infinitely many candidate values, and an exhaustive search is time-consuming and costly.

Methods: To address these challenges, we developed the Single-Hyperparameter Grid Search (SHGS) strategy, a preselection method applied before grid search, in which one target hyperparameter is varied over candidate values while the others are held fixed. We applied SHGS to DFNN models for breast cancer metastasis prediction, analyzing eight target hyperparameters: epochs, batch size, dropout, L1, L2, learning rate, decay, and momentum.

Results: We produced three figures, one for each of three LSM-I-10+-year datasets, illustrating the relationship between model performance and the values of each target hyperparameter. The experiments achieved maximum test AUC scores of 0.770, 0.762, and 0.886 on the 10-year, 12-year, and 15-year datasets, respectively. For each hyperparameter, we analyzed whether changing its value affects model performance, whether specific patterns emerge, and how to choose its value.

Conclusions: Our findings reveal that the optimal value of a hyperparameter depends not only on the dataset but also, significantly, on the settings of the other hyperparameters. Additionally, our experiments suggest a reduced range of candidate values for each target hyperparameter, which may be helpful for "low-budget" grid search.
This approach serves as a foundation for the subsequent use of grid search to enhance model performance.
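The SHGS idea described above, varying one target hyperparameter while all others stay fixed, can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the `shgs` function, the `fake_auc` stand-in for DFNN training, and all parameter names here are hypothetical, with the real scoring callable expected to train a model and return its test AUC.

```python
from typing import Callable, Dict, List, Tuple


def shgs(train_and_score: Callable[[Dict], float],
         target: str,
         candidates: List,
         fixed: Dict) -> List[Tuple[float, object]]:
    """Single-Hyperparameter Grid Search: score each candidate value
    of one target hyperparameter while the others are held fixed."""
    results = []
    for value in candidates:
        params = {**fixed, target: value}
        score = train_and_score(params)  # e.g. test AUC of a trained DFNN
        results.append((score, value))
    # Best-scoring candidate values first; the top region of this
    # ranking suggests a reduced range for a later full grid search.
    return sorted(results, reverse=True)


# Hypothetical stand-in for training a DFNN and returning its test AUC.
def fake_auc(params: Dict) -> float:
    return 0.7 - abs(params["learning_rate"] - 0.01)


fixed = {"batch_size": 32, "dropout": 0.5, "learning_rate": 0.01}
ranked = shgs(fake_auc, "learning_rate", [0.001, 0.01, 0.1], fixed)
print(ranked[0][1])  # candidate value with the best score
```

Running the ranking for each of the eight target hyperparameters in turn yields a preselected, narrowed value range per hyperparameter before the combinatorial grid search begins.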