ZOU Rui, JIAO Hui, LONG Wen. Improved Golden Jackal Optimization Algorithm for Solving Function Optimization and Feature Selection[J]. Journal of Xinyang Normal University (Natural Science Edition), 2024, 37(1): 113-119. DOI: 10.3969/j.issn.1003-0972.2024.01.017

Improved Golden Jackal Optimization Algorithm for Solving Function Optimization and Feature Selection


Abstract: The basic Golden Jackal Optimization (GJO) algorithm had several drawbacks, such as low computational precision, weak exploitation ability, and a tendency to get stuck in local optima when solving high-dimensional optimization problems. An improved GJO algorithm (I-GJO) was proposed. In I-GJO, the original randomly decreasing energy factor was replaced by a nonlinear decreasing factor based on the sine function to balance the global exploration and local exploitation abilities of the algorithm during the search process. In the later iteration stage of the algorithm, a somersault learning strategy was introduced to expand the population search range and improve the solution precision. To verify the effectiveness of the proposed I-GJO algorithm, six benchmark function optimization problems were selected for numerical experiments. The experimental results indicated that I-GJO achieved higher precision and faster convergence speed than the Grey Wolf Optimizer (GWO), the Seagull Optimization Algorithm (SOA), and the basic GJO algorithm. Finally, I-GJO was applied to solve the feature selection problem. The numerical results on sixteen benchmark datasets showed that I-GJO could effectively remove redundant features and improve the classification accuracy.
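The abstract names the two modifications but gives no formulas, so the following is a minimal Python sketch under stated assumptions: a sine-based nonlinearly decreasing energy factor (in place of basic GJO's linearly decreasing E1 = c1·(1 - t/T) component) and an MRFO-style somersault learning update around the current best solution. The exact decay curve, the constants c1 and sf, and the function names sine_energy_factor and somersault_learning are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

rng = np.random.default_rng()

def sine_energy_factor(t, T, c1=1.5):
    """Escaping-energy factor E = E1 * E0 (illustrative form, assumed).

    Basic GJO multiplies a random E0 in [-1, 1] by a linearly decreasing
    E1 = c1 * (1 - t / T).  The sketch below swaps in a nonlinear decay
    from c1 to 0 built from the sine function; the exact curve used in
    I-GJO may differ.
    """
    e0 = 2.0 * rng.random() - 1.0                    # random component in [-1, 1]
    e1 = c1 * (1.0 - np.sin(np.pi / 2.0 * t / T))    # assumed sine-based nonlinear decay
    return e1 * e0

def somersault_learning(x, x_best, sf=2.0):
    """Somersault learning step (MRFO-style form, assumed).

    Each individual flips around the current best solution, widening the
    search range around the best position found so far.
    """
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    return x + sf * (r1 * x_best - r2 * x)

# Minimal usage: one late-stage update of a 5-dimensional individual.
if __name__ == "__main__":
    T, t = 500, 450                                  # late iteration stage
    x = rng.uniform(-10, 10, size=5)                 # current position
    x_best = rng.uniform(-10, 10, size=5)            # best-so-far position
    print("E =", sine_energy_factor(t, T))
    print("x' =", somersault_learning(x, x_best))
```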

     

