Expert Systems with Applications, vol. 307, 2026 (SCI-Expanded, Scopus)
Metaheuristic optimization algorithms play a crucial role in solving complex real-world problems. However, many existing methods face persistent limitations, such as a suboptimal balance between exploration and exploitation and a pronounced susceptibility to local optima. The recent starfish optimization algorithm (SFOA) shows promise but suffers from fixed parameters and limited search diversity. To overcome these challenges, this study introduces the modified starfish optimization algorithm (M-SFOA), an enhanced version of SFOA that incorporates three key modifications: an exponentially decreasing adaptive step size, the integration of a momentum vector, and a multi-candidate local search strategy. The proposed algorithm was evaluated on 29 standard benchmark functions and the CEC-2022 test suite. Experimental results demonstrate that M-SFOA consistently outperforms, or is at least competitive with, the original SFOA. Furthermore, statistical analysis using the Wilcoxon signed-rank test confirms that the performance differences between M-SFOA and the baseline algorithm are statistically significant. To demonstrate its practical efficacy, M-SFOA was also applied to metaheuristic-based hyperparameter optimization in machine learning and deep learning problems. Comparative studies against widely used metaheuristics, including differential evolution (DE), artificial bee colony (ABC), the firefly algorithm (FA), covariance matrix adaptation evolution strategy (CMA-ES), success-history-based parameter adaptation for differential evolution (SHADE), Harris hawks optimization (HHO), the arithmetic optimization algorithm (AOA), and the whale optimization algorithm (WOA), reveal that M-SFOA achieves superior performance. The findings indicate that M-SFOA provides more effective solutions than existing algorithms, establishing it as a robust optimizer for both theoretical benchmarks and complex real-world applications, particularly multi-dimensional hyperparameter tuning.
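Since the abstract names the three enhancements but not their update rules, the following is a minimal Python sketch of how one M-SFOA-style position update might combine them. The function `m_sfoa_step`, its parameters (`alpha0`, `decay`, `beta`, `n_candidates`), and the specific formulas are illustrative assumptions, not the published equations.

```python
import numpy as np

def m_sfoa_step(pos, best, velocity, t, t_max, bounds, obj,
                alpha0=1.0, decay=5.0, beta=0.9, n_candidates=3):
    """One hypothetical M-SFOA-style update combining the three
    enhancements named in the abstract (illustrative only)."""
    lo, hi = bounds

    # 1) Exponentially decreasing adaptive step size: large early steps
    #    favor exploration, shrinking steps favor exploitation.
    alpha = alpha0 * np.exp(-decay * t / t_max)

    # 2) Momentum vector: blend the previous movement with the pull
    #    toward the current global best to smooth the search trajectory.
    velocity = beta * velocity + (1.0 - beta) * (best - pos)
    moved = np.clip(pos + alpha * velocity, lo, hi)

    # 3) Multi-candidate local search: sample several perturbations
    #    around the moved point and greedily keep the best one.
    candidates = moved + alpha * np.random.standard_normal(
        (n_candidates, pos.size))
    candidates = np.clip(candidates, lo, hi)
    scores = np.apply_along_axis(obj, 1, candidates)
    k = int(np.argmin(scores))
    if scores[k] < obj(moved):
        moved = candidates[k]
    return moved, velocity
```

Under this assumed scheme, the exponential schedule gradually shifts the balance from exploration to exploitation, while the greedy multi-candidate step reduces the chance of stagnating in a local optimum, which is consistent with the limitations of the original SFOA that the abstract describes.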