Adapting Differential Evolution Algorithms for Continuous Optimization via Greedy Adjustment of Control Parameters


Abstract

Differential evolution (DE) represents a class of evolutionary and meta-heuristic techniques that have been applied successfully to many real-world problems. However, the performance of DE is strongly influenced by its control parameters, such as the scaling factor and the crossover probability. This paper proposes a new adaptive DE algorithm that greedily adjusts these control parameters while DE is running. The basic idea is to perform a greedy search for better parameter assignments in successive learning periods throughout the evolutionary process. Within each learning period, the current parameter assignment and its neighboring assignments are each applied a number of times to obtain a reliable assessment of their suitability in the stochastic environment created by the DE operations. The current assignment is then replaced by the best candidate identified in the neighborhood, and the search moves on to the next learning period. This greedy parameter adjustment method has been incorporated into basic DE, leading to a new algorithm termed Greedy Adaptive Differential Evolution (GADE). GADE has been tested on 25 benchmark functions in comparison with five other DE variants. The evaluation results demonstrate that GADE is strongly competitive: it obtained the best rank among the compared algorithms in terms of the sum of relative errors across the high-dimensional benchmark functions.
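
The greedy adjustment loop described in the abstract can be sketched as follows. This is a minimal illustration of the idea, not the exact GADE procedure: the neighborhood step size, the parameter bounds, the number of learning periods, the trial count per candidate, and the evaluate_assignment helper are all assumptions introduced for the example.

```python
def neighbors(F, CR, step=0.1):
    """Return the current (F, CR) assignment and its neighboring
    assignments on a small grid, clipped to illustrative bounds."""
    candidates = []
    for dF in (-step, 0.0, step):
        for dCR in (-step, 0.0, step):
            candidates.append((min(max(F + dF, 0.1), 1.0),
                               min(max(CR + dCR, 0.0), 1.0)))
    return candidates


def greedy_adapt(evaluate_assignment, F=0.5, CR=0.9,
                 periods=20, trials_per_candidate=5):
    """Greedy search over (F, CR) across successive learning periods.

    evaluate_assignment(F, CR, trials) is a hypothetical helper that is
    assumed to run DE for a few generations with the given parameters and
    return an average fitness improvement (higher is better); it stands in
    for the repeated testing that yields a reliable assessment of each
    candidate assignment in the stochastic DE environment.
    """
    for _ in range(periods):
        # Score the current assignment and its neighbors, then keep the best.
        scored = [(evaluate_assignment(f, cr, trials_per_candidate), f, cr)
                  for f, cr in neighbors(F, CR)]
        _, F, CR = max(scored)
    return F, CR
```

In this sketch each learning period scores the current assignment together with its grid neighbors and greedily adopts the best one, mirroring the period-by-period adjustment described above.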


