Abstract
This study proposes a Hybrid Strategy Improved Dingo Optimization Algorithm (HSIDOA), designed to address the limitations of the standard DOA in complex optimization tasks: its tendency to fall into local optima, slow convergence, and inefficient boundary handling. The HSIDOA integrates a quadratic interpolation search strategy, a horizontal crossover search strategy, and a centroid-based opposition learning boundary-handling mechanism. By enhancing local exploitation, global exploration, and out-of-bounds correction, the algorithm forms an optimization framework that excels in convergence accuracy, speed, and stability. On the CEC2017 (30-dimensional) and CEC2022 (10/20-dimensional) benchmark suites, the HSIDOA achieves significantly superior performance in terms of average fitness, standard deviation, convergence rate, and Friedman test rankings, outperforming seven mainstream algorithms, namely MLPSO, MELGWO, MHWOA, ALA, HO, RIME, and the standard DOA. The results demonstrate strong robustness and scalability across different dimensional settings. Furthermore, HSIDOA is applied to multi-level threshold image segmentation, where Otsu's maximum between-class variance serves as the objective function and PSNR, SSIM, and FSIM serve as evaluation metrics. Experimental results show that HSIDOA consistently achieves the best segmentation quality across four threshold counts (4, 6, 8, and 10). Its convergence curves exhibit rapid decline and early stabilization, with stability surpassing that of all comparison algorithms. In summary, HSIDOA delivers comprehensive improvements in global exploration capability, local exploitation precision, convergence speed, and high-dimensional robustness, providing an efficient, stable, and versatile optimization method suitable for both complex numerical optimization and image segmentation tasks.
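To make the segmentation objective concrete, the sketch below shows the standard multi-level Otsu between-class variance computed from a grayscale histogram; this is the general form of the criterion the abstract names, not the authors' specific implementation, and the function name `otsu_between_class_variance` is illustrative. The optimizer would maximize this quantity over candidate threshold vectors.

```python
import numpy as np

def otsu_between_class_variance(hist, thresholds):
    """Multi-level Otsu criterion: sum over classes of w_k * (mu_k - mu_total)^2.

    hist       : 1-D array of pixel counts per gray level (e.g. length 256).
    thresholds : list of integer thresholds partitioning the gray levels.
    Returns the between-class variance, which the optimizer maximizes.
    """
    p = hist / hist.sum()                     # normalized histogram (probabilities)
    levels = np.arange(len(p))
    bounds = [0] + sorted(thresholds) + [len(p)]
    mu_total = (levels * p).sum()             # global mean gray level
    sigma_b = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()                    # class probability (weight)
        if w > 0:
            mu = (levels[lo:hi] * p[lo:hi]).sum() / w   # class mean
            sigma_b += w * (mu - mu_total) ** 2
    return sigma_b
```

For a histogram with two well-separated modes, a threshold between the modes yields a much larger between-class variance than one placed inside a single mode, which is what drives the threshold search.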