Applied Soft Computing

Volume 60, November 2017, Pages 281-296

Generation of Particle Swarm Optimization algorithms: An experimental study using Grammar-Guided Genetic Programming

https://doi.org/10.1016/j.asoc.2017.06.040

Highlights

  • The PSO performance depends on the fine tuning of different parameters.

  • We contrasted the performance of four well-known GGGP approaches for evolving PSO algorithms.

  • Experiments were performed on 32 optimization problems with different levels of difficulty.

  • The generated algorithms achieved results competitive with state-of-the-art algorithms.

Abstract

Particle Swarm Optimization (PSO) is widely used to solve optimization problems effectively. Nonetheless, PSO performance depends on the fine tuning of several parameters. To make the algorithm design process less dependent on human intervention, some researchers have treated this task as an optimization problem itself. Grammar-Guided Genetic Programming (GGGP) algorithms, in particular, have been widely studied and applied in the context of algorithm optimization. Unlike methods that merely select designs from a pre-defined, limited search space, GGGP algorithms produce customized designs based on a set of production rules defined in a grammar. Although GGGP algorithms have been used extensively in other contexts, they have not been deeply investigated for the generation of PSO algorithms. This work therefore applies GGGP algorithms to the PSO algorithm design problem. We performed an experimental study comparing different GGGP approaches for the generation of PSO algorithms, with the main goal of identifying the pros and cons of each approach on this task. In the experiments, a tree-based GGGP approach was compared with commonly used linear GGGP approaches for the generation of PSO algorithms. The results showed that the tree-based GGGP produced better algorithms than its counterparts. We also compared the algorithms generated by the tree-based technique with state-of-the-art optimization algorithms, and they achieved competitive results.

Introduction

Particle Swarm Optimization (PSO) is a widely used meta-heuristic that has attracted the attention of many researchers due to the competitive results it has achieved in different applications [1]. Many studies have been conducted over the years to enhance the performance of the standard PSO algorithm [1]. Among these works, some proposed new mechanisms to control the particles’ velocity, while others developed novel communication topologies to balance the particles’ ability to explore large areas of the search space and to refine the search once promising regions are found. Other works incorporated genetic operators to increase diversity during the search. A great number of these developments have proven useful and contributed significantly to PSO's performance. This diversity, however, raised another challenge: the PSO algorithm strongly depends on an adequate choice of parameters and components (i.e., a design) to perform well on a specific optimization problem [1]. This paper addresses the problem of designing PSO algorithms, that is, defining a suitable configuration of parameters and components for the PSO algorithm when applied to a given optimization problem.

The field of hyper-heuristics has become an emergent research direction for the problem of algorithm optimization [2]. Hyper-heuristics are search methods (meta-heuristics) that operate on a search space of heuristics (or of algorithms and their components). Originally, hyper-heuristic approaches were proposed to select algorithms, parameter values and/or algorithm components from a pre-determined, limited search space [3], [4]. More recently, hyper-heuristics have been developed to generate algorithms from specified components, functions, and complex programming elements. The advantage of generation hyper-heuristics over selection ones is their flexibility to create new algorithms for a given problem or class of problems [2]. One of the most widely used generation hyper-heuristics is Genetic Programming (GP) [5], whose popularity comes from its flexibility to generate more sophisticated and customized algorithms [6]. The advent of generation hyper-heuristics, and of GP in particular, is responsible for advances in combinatorial optimization and machine learning/data mining. In combinatorial optimization, GP has been employed to solve different problems, such as bin packing [7], [8], satisfiability [9], [10], scheduling [11], [12] and the travelling salesman problem [13]. In machine learning and data mining, GP has been used to optimize classification models [14], [15], [16], [17], [18], to discover classification rules [19], [20], and for knowledge discovery [21] and other tasks [22].

Although GP has achieved considerable advances in different areas, it presents a critical problem: the closure property is not guaranteed, which allows the creation of syntactically incorrect, i.e., infeasible, solutions. This inherent drawback of GP means that many solutions must be eliminated during the search process. New GP variations, such as Grammar-Guided Genetic Programming (GGGP) [23], have been proposed to overcome this limitation. GGGP generates algorithms according to the production rules defined in a grammar. This grammar plays a crucial role in the generation of algorithms because it encodes information about the problem, so that the created algorithms respect the problem's constraints.

Generation hyper-heuristics have not been deeply investigated in the context of the PSO algorithm design problem. Some works adopted GP to evolve velocity equations to control the particles in a PSO algorithm [24], [25], [26]. Although these works reached promising results, the generation of infeasible (syntactically incorrect) solutions during the optimization harmed the search for better solutions. To avoid this problem, some previous works adopted GGGP approaches to produce customized PSO algorithms [24], [27], [28], [29]. In a recent work, we adapted and compared two different GGGP approaches for the generation of PSO algorithms [29]. The current paper is an expanded version of that preliminary work, in which we: i) present a more detailed introduction to GGGP approaches, ii) provide a deeper discussion of related work, iii) perform an additional experimental investigation considering four different GGGP techniques for the PSO algorithm generation task, and iv) present a detailed analysis of the experimental results. Our main goal is to contrast the most used GGGP approaches in the generation of adequate PSO algorithms.

The experiments were performed on 32 unconstrained continuous optimization problems with different levels of difficulty. GGGP was evaluated in three distinct batteries of experiments. First, for each optimization problem, each adopted GGGP approach returns an algorithm (i.e., a PSO optimized for that problem). The optimization results obtained by the produced algorithms were then compared across the GGGP approaches and optimization problems considered. In this way, we identified the best of the evaluated GGGP approaches. Second, for each optimization problem, we executed as baselines the optimization algorithms adopted in the competition of the International Conference on Swarm Intelligence 2014 (ICSI 2014) [30]. The results obtained by the baselines were compared with those obtained by the PSO algorithms produced by the best GGGP approach. The results showed that the algorithms produced by the best GGGP approach achieved better results than state-of-the-art algorithms from the literature. Finally, the third experiment presents the designs of the algorithms generated by CFG-GP for the given problems.

This work is organized as follows. Section 2 presents the basic concepts of GGGP approaches. Section 3 introduces the PSO algorithm and its main components. Section 4 discusses related work on the optimization of PSO algorithms. Section 5 describes the experimental design of this work. In Section 6, the results are presented and discussed. Finally, Section 7 presents the conclusions and possible future work.

Section snippets

Grammar-based Genetic Programming

Owing to the potential of grammars for algorithm optimization, GGGP techniques have received increasing attention in the GP field over the years. Before going into detail about these GGGP algorithms, the concepts of genotype and phenotype need to be clarified. The term genotype refers to the data structure (e.g., a vector of variables) manipulated by genetic operators such as mutation and crossover. It is used to generate the phenotype, which is an executable structure (program or algorithm)
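As a minimal sketch of the genotype-to-phenotype distinction, the snippet below maps a linear genome of integer codons onto an expression using a grammatical-evolution-style derivation. The grammar and genome here are invented for illustration and are not the PSO grammar used in this work.

```python
# Toy grammar: each non-terminal maps to a list of candidate productions.
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["<var>"]],
    "<op>": [["+"], ["-"], ["*"]],
    "<var>": [["x"], ["y"]],
}

def derive(genome, symbol="<expr>", max_depth=8):
    """Expand `symbol` by consuming codons from the genome, left to right."""
    if symbol not in GRAMMAR:              # terminal symbol: emit as-is
        return [symbol]
    if max_depth <= 0:                     # depth limit: shortest production
        choice = min(GRAMMAR[symbol], key=len)
    else:
        rules = GRAMMAR[symbol]
        codon = genome.pop(0) if genome else 0
        choice = rules[codon % len(rules)] # codon selects the production
    out = []
    for sym in choice:
        out.extend(derive(genome, sym, max_depth - 1))
    return out

genome = [0, 1, 0, 2, 1, 1]               # genotype: linear vector of codons
phenotype = " ".join(derive(list(genome))) # phenotype: derived expression
print(phenotype)                           # → x * y
```

Because every expansion follows a production rule, any genome decodes to a syntactically valid expression, which is exactly the closure guarantee that plain GP lacks.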

PSO algorithm

PSO is a meta-heuristic created in the 1990s, inspired by the social behavior of bird flocks seeking food [1]. PSO has attracted increasing interest over the years due to its competitive results in different applications. PSO optimizes a set of solutions represented by positions in the search space as follows. In the beginning, the positions and velocities of a predetermined number of particles are randomly initialized. At each iteration of the algorithm, each particle moves through the search
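The movement rule just described can be sketched as follows. This uses the common inertia-weight velocity update on the Sphere benchmark; the coefficient values (w, c1, c2) are typical textbook defaults, not necessarily the exact variant or tuning evaluated in this paper.

```python
import random

def sphere(x):
    """Benchmark objective: f(x) = sum of x_i^2, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One iteration of the canonical inertia-weight PSO update."""
    for i, x in enumerate(positions):
        for d in range(len(x)):
            r1, r2 = random.random(), random.random()
            # velocity: inertia term + cognitive pull + social pull
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - x[d])
                                + c2 * r2 * (gbest[d] - x[d]))
            x[d] += velocities[i][d]           # position update
        if sphere(x) < sphere(pbest[i]):       # update personal best
            pbest[i] = list(x)
    return min(pbest, key=sphere)              # candidate global best

random.seed(42)
n, dim = 10, 2
positions = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
velocities = [[0.0] * dim for _ in range(n)]
pbest = [list(p) for p in positions]
gbest = min(pbest, key=sphere)
for _ in range(50):
    gbest = min(gbest, pso_step(positions, velocities, pbest, gbest), key=sphere)
print(sphere(gbest))  # best fitness found, shrinking toward 0
```

The design choices a GGGP grammar can vary are visible here: the velocity equation itself, the coefficients, and the neighborhood that supplies `gbest` (a global topology in this sketch).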

PSO algorithm design problem

Although PSO has shown fast convergence in different applications, several developments were introduced to improve the basic PSO algorithm. Among these enhancements to standard PSO, we highlight new topologies, new velocity equations (e.g., adopting the inertia weight or the constriction factor) and the use of mutation operators [1], [52], [53], [54]. As a consequence of the variety of PSO components and hyper-parameters (e.g., number of particles, acceleration
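One way to picture a single point in this design space is as a configuration of components and hyper-parameters that the generation process must choose. The component names and value ranges below are illustrative assumptions, not the exact terminals of the grammar used in this work.

```python
# Illustrative PSO design: each key is one design decision.
design = {
    "topology": "ring",            # communication structure (global, ring, ...)
    "velocity_update": "inertia",  # inertia weight vs. constriction factor
    "mutation": None,              # optional mutation operator, or None
    "num_particles": 30,
    "inertia_weight": 0.7,
    "c1": 1.5,                     # cognitive acceleration coefficient
    "c2": 1.5,                     # social acceleration coefficient
}

def is_valid(d):
    """Sanity-check a candidate design before spending budget evaluating it."""
    return (d["topology"] in {"global", "ring", "von_neumann"}
            and d["num_particles"] > 0
            and 0.0 < d["inertia_weight"] < 1.0)

print(is_valid(design))  # True
```

A grammar plays the role of `is_valid` constructively: instead of filtering bad combinations after the fact, its production rules only ever derive designs that satisfy the constraints.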

Experimental design

This section presents the experiments conducted to compare the GGGP approaches on the generation of PSO algorithms. To perform this comparison, we selected four algorithms: GE, GS, and GDE, which use the linear genome representation, and the Context-free Grammar Genetic Programming (CFG-GP) [23], which uses the tree-based representation. We highlight that a tree-based GGGP algorithm has never been tested for the PSO generation, apart from our own previous research. As a baseline, we

Results

This section presents in detail the results obtained by the GGGP approaches when applied to the PSO algorithm design problem. As previously mentioned in Section 5, we performed two investigations. First, we compared the results achieved by each GGGP approach considering all 32 optimization problems. Second, we compared the results achieved by the algorithms produced by the best GGGP approach (from the previous experiment) with the results reached by algorithms adopted in the ICSI 2014,

Conclusion

This paper investigates the problem of generating PSO algorithms. The intention is to use Grammar-Guided Genetic Programming (GGGP) to automatically produce PSO designs that are adequate for a given problem with respect to optimization performance. This paper presented the historical development of the GGGP approaches, mentioning their main characteristics and applications. Although GGGP algorithms have shown their potential to produce customized algorithms in other contexts, few works have

Acknowledgments

The authors would like to thank CNPq, CAPES, and FACEPE (Brazilian Agencies) for their financial support.

References (69)

  • E.K. Burke et al.

    Exploring hyper-heuristic methodologies with genetic programming

    Computational intelligence

    (2009)
  • J.R. Woodward et al.

    Template method hyper-heuristics

    Proceedings of the 2014 Conference Companion on Genetic and Evolutionary Computation Companion

    (2014)
  • E.K. Burke et al.

    A genetic programming hyper-heuristic approach for evolving 2-d strip packing heuristics

    IEEE Trans. Evol. Comput.

    (2010)
  • R. Poli et al.

    A histogram-matching approach to the evolution of bin-packing strategies

  • A.S. Fukunaga

    Automated discovery of local search heuristics for satisfiability testing

    Evol. Comput.

    (2008)
  • M. Bader-El-Den et al.

    Generating sat local-search heuristics using a GP hyper-heuristic framework

    Artificial Evolution

    (2008)
  • E.K. Burke et al.

    Hyper-heuristics: a survey of the state of the art

    J. Oper. Res. Soc.

    (2013)
  • X. Yao

    Evolving artificial neural networks

    Proc. IEEE

    (1999)
  • A. Vella et al.

    Hyper-heuristic decision tree induction

  • W. Smart et al.

    Using genetic programming for multiclass classification by simultaneously solving component binary classification problems

    Genetic Programming

    (2005)
  • M.C. Bot et al.

    Application of genetic programming to induction of linear classification trees

    Genetic Programming

    (2000)
  • G. Folino et al.

    Genetic programming and simulated annealing: a hybrid method to evolve decision trees

    Genetic Programming

    (2000)
  • C.C. Bojarczuk et al.

    Discovering comprehensible classification rules by using genetic programming: a case study in a medical domain

  • C.C. Bojarczuk et al.

    Genetic programming for knowledge discovery in chest-pain diagnosis

    IEEE Eng. Med. Biol. Mag.

    (2000)
  • G.L. Pappa et al.

    Automating the Design of Data Mining Algorithms: An Evolutionary Computation Approach

    (2009)
  • R.I. Mckay et al.

    Grammar-based genetic programming: a survey

    Genet. Program. Evolvable Machines

    (2010)
  • R. Poli et al.

    Exploring extended particle swarms: a genetic programming approach

  • R. Poli et al.

    Extending particle swarm optimisation via genetic programming

    European Conference on Genetic Programming

    (2005)
  • M. Rashid et al.

    Adaptable evolutionary particle swarm optimization

    3rd International Conference on Innovative Computing Information and Control, 2008. ICICIC’08

    (2008)
  • T. Si et al.

    Grammatical swarm based-adaptable velocity update equations in particle swarm optimizer

    Proceedings of the International Conference on Frontiers of Intelligent Computing: Theory and Applications (FICTA) 2013

    (2014)
  • P.B. Miranda et al.

    GEFPSO: a framework for PSO optimization based on grammatical evolution

  • P.B. Miranda et al.

    Tree-based grammar genetic programming to evolve particle swarm algorithms

  • Y. Tan, J. Li, Z. Zheng, Introduction and ranking results of the ICSI 2014 competition on single objective...
  • M. O’Neil et al.

    Grammatical evolution

    Grammatical Evolution

    (2003)