Fusing Multiple Strategies in Population-Based Optimization Algorithm

Abstract:

A multi-strategy population-based optimization algorithm, referred to as MSPO, is proposed in this paper. The algorithm is developed by hybridizing four population-based techniques: bare bones particle swarm optimization, quantum-behaved particle swarm optimization, differential evolution and opposition-based learning. It aims to enhance the exploration and exploitation capabilities of population-based algorithms for general optimization problems. The four strategies are selected at random with equal probability during the search process. The proposed algorithm is validated on a set of test functions, and its performance is compared with those of particle swarm optimization and bare bones particle swarm optimization. Numerical results show that performance improves greatly in both solution quality and convergence speed.
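
The core mechanism stated in the abstract, picking one of the four update strategies uniformly at random for each particle update, can be illustrated with a short sketch. Since only the abstract is available here, the Python code below is an illustration under stated assumptions rather than the authors' implementation: the exact operator formulas, parameter values (population size, DE's F and CR, QPSO's contraction coefficient beta) and bookkeeping are placeholders chosen for clarity.

```python
# Minimal illustrative sketch (not the authors' code) of equal-probability
# strategy selection among bare-bones PSO, quantum-behaved PSO, differential
# evolution and opposition-based learning. All parameters are assumptions.
import numpy as np

def mspo_sketch(f, lower, upper, n_particles=30, n_iters=200,
                F=0.5, CR=0.9, beta=0.75, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lower, float), np.asarray(upper, float)
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))       # current positions
    pbest = x.copy()                                        # personal bests
    pval = np.array([f(p) for p in pbest])                  # personal-best values

    for _ in range(n_iters):
        gbest = pbest[np.argmin(pval)]                      # global best
        mbest = pbest.mean(axis=0)                          # mean best (used by QPSO)
        for i in range(n_particles):
            strategy = rng.integers(4)                      # equal-probability choice
            if strategy == 0:                               # bare-bones PSO: Gaussian sample
                mu = 0.5 * (pbest[i] + gbest)
                sigma = np.abs(pbest[i] - gbest) + 1e-12
                cand = rng.normal(mu, sigma)
            elif strategy == 1:                             # quantum-behaved PSO update
                phi = rng.random(dim)
                attractor = phi * pbest[i] + (1.0 - phi) * gbest
                u = rng.uniform(1e-12, 1.0, dim)
                sign = np.where(rng.random(dim) < 0.5, -1.0, 1.0)
                cand = attractor + sign * beta * np.abs(mbest - x[i]) * np.log(1.0 / u)
            elif strategy == 2:                             # DE/rand/1 with binomial crossover
                others = [j for j in range(n_particles) if j != i]
                a, b, c = rng.choice(others, size=3, replace=False)
                mutant = x[a] + F * (x[b] - x[c])
                mask = rng.random(dim) < CR
                mask[rng.integers(dim)] = True              # keep at least one mutant gene
                cand = np.where(mask, mutant, x[i])
            else:                                           # opposition-based learning
                cand = lo + hi - x[i]
            cand = np.clip(cand, lo, hi)                    # respect the search bounds
            x[i] = cand
            v = f(cand)
            if v < pval[i]:                                 # greedy personal-best update
                pval[i], pbest[i] = v, cand.copy()

    best = int(np.argmin(pval))
    return pbest[best], pval[best]

# Usage: minimize the 10-dimensional sphere function
if __name__ == "__main__":
    sphere = lambda z: float(np.dot(z, z))
    xbest, fbest = mspo_sketch(sphere, np.full(10, -5.0), np.full(10, 5.0))
    print(fbest)
```

The sketch keeps the selection scheme exactly as the abstract describes it: the choice among the four operators is uniform, so no adaptation or learning of selection probabilities is assumed.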

Info:

Pages: 1407-1411

Online since: May 2015

Copyright: © 2015 Trans Tech Publications Ltd. All Rights Reserved

References:

[1] C.H. Chen: A revised bare bone particle swarm optimizer and its variant, International Conference on Fuzzy Theory and Its Applications, Dec. 6-8, Taipei, Taiwan (2013) 488-493. DOI: 10.1109/ifuzzy.2013.6825466

[2] D.E. Goldberg: Genetic algorithms in search, optimization and machine learning, Addison-Wesley (1989).

[3] D. Karaboga and B. Basturk: A powerful and efficient algorithm for numerical function optimization: Artificial Bee Colony (ABC) algorithm, Journal of Global Optimization, 39 (2007) 459-471. DOI: 10.1007/s10898-007-9149-x

[4] H.R. Tizhoosh: Opposition-based learning: A new scheme for machine intelligence, Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation, Vol. 1, Austria (2005) 695-707.

[5] J. Kennedy and R.C. Eberhart: Particle swarm optimization, Proceedings of the IEEE International Conference on Neural Networks, 4 (1995) 1942-1948.

[6] J. Kennedy: Bare bones particle swarms, Proceedings of the IEEE Swarm Intelligence Symposium, IEEE Press (2003) 80-87.

[7] J. Sun, B. Feng and W. Xu: Particle swarm optimization with particles having quantum behavior, IEEE Congress on Evolutionary Computation (2004) 325-331. DOI: 10.1109/cec.2004.1330875

[8] R. Storn and K. Price: Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces, Journal of Global Optimization, 11 (1997) 341-359.

[9] T. Bäck, U. Hammel and H.P. Schwefel: Evolutionary computation: Comments on the history and current state, IEEE Trans. on Evolutionary Computation, 1 (1997) 3-17. DOI: 10.1109/4235.585888

[10] T.W. Liao: Two hybrid differential evolution algorithms for engineering design optimization, Applied Soft Computing, 10 (2010) 1188-1199. DOI: 10.1016/j.asoc.2010.05.007