Abstract
k-nearest neighbors (KNN) is a classic supervised classification method widely used in pattern recognition and data mining. However, traditional KNN can suffer degraded discriminative performance in high-dimensional feature spaces, where distance measures lose their ability to separate samples. Moreover, when class distributions are imbalanced, its equal-weight voting mechanism is prone to decision bias, undermining the fairness of the classification results. To improve the discriminative ability and adaptability of KNN in complex data environments, this paper proposes an enhanced k-nearest neighbors algorithm with Grey-Wolf-Optimizer-based metric reconstruction and inverse class frequency weighting (GWKNN). While retaining the overall KNN model structure, the method introduces the Grey Wolf Optimizer to perform global adaptive reconstruction of the inter-sample distance matrix, so as to better capture the nonlinear structure and semantic associations in the feature space and to mitigate the failure of the traditional Euclidean distance in non-Euclidean spaces. To address the neighborhood decision bias caused by multi-class imbalance, an inverse class frequency weighting strategy based on class prior frequencies is designed to suppress the dominance of majority classes at the voting level, thereby improving the model's sensitivity to minority-class samples and the overall fairness of classification. Comparative experiments on 12 public datasets show that the proposed GWKNN algorithm outperforms traditional KNN and other mainstream classification methods in classification accuracy and adaptability, demonstrating strong overall performance and practical application potential.
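The inverse class frequency voting idea can be illustrated with a minimal sketch. This is not the paper's implementation: the specific weight form w_c = N / n_c, the plain Euclidean neighbor search, and all function names here are assumptions made for illustration only.

```python
from collections import Counter
import math

def inverse_class_frequency_weights(labels):
    # Assumed weight form: w_c = N / n_c, so rarer classes get larger votes.
    counts = Counter(labels)
    n = len(labels)
    return {c: n / cnt for c, cnt in counts.items()}

def weighted_knn_predict(train_X, train_y, x, k, weights):
    # Plain Euclidean KNN, except each neighbor's vote is scaled by its
    # class weight instead of counting as exactly 1.
    dists = sorted((math.dist(xi, x), yi) for xi, yi in zip(train_X, train_y))
    votes = Counter()
    for _, yi in dists[:k]:
        votes[yi] += weights[yi]
    return votes.most_common(1)[0][0]

# Toy imbalanced data: 5 majority 'A' samples, 2 minority 'B' samples.
X = [(0.0, 0), (0.1, 0), (0.2, 0), (1.0, 0), (1.05, 0), (1.3, 0), (1.5, 0)]
y = ['A', 'A', 'A', 'A', 'A', 'B', 'B']
w = inverse_class_frequency_weights(y)          # {'A': 1.4, 'B': 3.5}
pred = weighted_knn_predict(X, y, (1.15, 0), k=3, weights=w)
```

With k = 3 the query's neighborhood contains two 'A' points and one 'B' point, so unweighted voting would pick the majority class 'A'; the inverse-frequency weights (2 × 1.4 = 2.8 for 'A' versus 3.5 for 'B') flip the decision to the minority class, which is the bias-suppression effect the abstract describes.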