Applied Soft Computing

Volume 62, January 2018, Pages 162-175

A hyper-heuristic approach to automated generation of mutation operators for evolutionary programming

https://doi.org/10.1016/j.asoc.2017.10.002

Highlights

  • Designing mutation operators for evolutionary programming involves much manual effort.

  • Genetic programming is used to evolve mutation operators for evolutionary programming.

  • A train-and-test approach is used to evaluate performance over 23 function classes.

  • The evolved mutation operators outperform existing operators on classes of functions.

  • Operators evolved for specific classes also outperform other evolved operators.

Abstract

Evolutionary programming can solve black-box function optimisation problems by evolving a population of numerical vectors. The variation component in the evolutionary process is supplied by a mutation operator, which is typically a Gaussian, Cauchy, or Lévy probability distribution. In this paper, we use genetic programming to automatically generate mutation operators for an evolutionary programming system, testing the proposed approach over a set of function classes, each of which represents a source of related functions. The empirical results over a set of benchmark function classes illustrate that genetic programming can evolve mutation operators which generalise well from the training set to the test set on each function class. The proposed method outperforms existing human-designed mutation operators with statistical significance in most cases, with competitive results observed for the rest.

Introduction

Black-box function optimisation is the task of finding the optima of an objective function for which we do not have access to an analytical form. This paper is concerned with evolutionary programming (EP) [1], which evolves a population of real-valued input vectors for a function, a technique widely applied to real-world problems [2], [3], [4]. As EP has an evolutionary basis, each vector undergoes selection, evaluation, and mutation, with the expectation that progressively fitter vectors are obtained. Here we focus on the mutation component of EP, which in the past has been designed manually. A hyper-heuristic is a search method or learning mechanism for selecting or generating heuristics to solve computational search problems [5]. Genetic programming (GP) [6] is a population-based evolutionary computation method for evolving program trees that has frequently been used as a generation hyper-heuristic in the literature [7].
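To make the role of the mutation operator concrete, the following is a minimal sketch of an EP-style loop with a pluggable mutation operator. The function names, parameter values, and the simple elitist (μ + μ) selection are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def sphere(x):
    """Illustrative black-box objective (minimisation): f(x) = sum(x_i^2)."""
    return float(np.sum(x ** 2))

def mutate_gaussian(x, rng, sigma=0.1):
    """CEP-style mutation: perturb each component with Gaussian noise."""
    return x + sigma * rng.standard_normal(x.shape)

def evolve(f, dim=10, pop_size=20, generations=200, mutate=mutate_gaussian, seed=0):
    """Minimal EP-style loop; the mutation operator is the pluggable part."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    for _ in range(generations):
        offspring = np.array([mutate(x, rng) for x in pop])
        combined = np.vstack([pop, offspring])
        fitness = np.array([f(x) for x in combined])
        pop = combined[np.argsort(fitness)[:pop_size]]  # elitist (mu + mu) selection
    return min(f(x) for x in pop)

best = evolve(sphere)
```

Swapping `mutate_gaussian` for a Cauchy- or Lévy-based operator, or for an evolved one, changes only the variation step; the rest of the loop is untouched.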

In this paper, we use GP as an offline generation hyper-heuristic to automatically create mutation operators for EP operating on function classes. A mutation operator in EP is a probability distribution, represented as a random number generator. We present an algorithmic framework which can not only express a number of currently existing EP mutation operators, but also generate novel variants of EP mutation operators. Using a train-and-test approach, GP evolves mutation operators for EP on a training set drawn from a class of functions, and each evolved operator is then validated on a larger set of unseen instances taken from the same class. We use the term automatically designed mutation operators (ADMs) to describe the mutation operators generated by GP. We demonstrate that the ADMs for EP are capable of comparable, and often superior, performance to existing human-designed operators. An additional set of experiments, in which ADMs trained on one function class are tested on a different function class, is also conducted to further examine the performance of the evolved ADMs.
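The idea of representing a mutation operator as a GP-evolvable program can be sketched as an expression tree over random-number primitives: evaluating the tree draws fresh random numbers, so the tree acts as a random number generator. The primitive set below is an assumption for illustration only, not the paper's exact function and terminal set:

```python
import numpy as np

rng = np.random.default_rng(42)

# Terminals draw primitive random samples (or constants).
TERMINALS = {
    "gauss": lambda: rng.standard_normal(),   # N(0, 1) sample
    "cauchy": lambda: rng.standard_cauchy(),  # heavy-tailed Cauchy sample
    "const": lambda: 0.5,
}
# Functions combine subtree values: (arity, implementation). Arity would
# guide random tree generation in a full GP system.
FUNCTIONS = {
    "add": (2, lambda a, b: a + b),
    "mul": (2, lambda a, b: a * b),
}

def evaluate(tree):
    """Recursively evaluate a nested-tuple expression tree; each call
    draws fresh random numbers, so the tree is a random generator."""
    if isinstance(tree, str):
        return TERMINALS[tree]()
    op, *args = tree
    arity, fn = FUNCTIONS[op]
    return fn(*(evaluate(a) for a in args))

# One candidate operator: 0.5 * (N(0,1) + Cauchy) -- a single point in
# the space of operators that a GP search would explore.
candidate = ("mul", "const", ("add", "gauss", "cauchy"))
samples = [evaluate(candidate) for _ in range(1000)]
```

A GP hyper-heuristic would search over such trees, scoring each candidate by the performance of an EP run that uses it as the mutation distribution.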

This paper builds on previous work. Hong et al. [8] first demonstrated that GP could automatically construct the random number generators typically used in EP. A second paper showed that ADMs could be trained on collections of function classes, giving good performance across a broader range of functions [9]; however, a trade-off between general training and specific performance was observed. This paper presents a study of the design of 23 ADMs for 23 function classes, and then tests each of the 23 ADMs on each of the function classes. Real-valued optimisation is an active research topic, and several population-based metaheuristics have been applied to function optimisation, including differential evolution and its variants [10], [11], particle swarm optimisation [12], the covariance matrix adaptation evolution strategy (CMA-ES) [13] and hybrid methods [14]. However, as this study is concerned specifically with the modification of EP, a full review of these algorithms is beyond the scope of this paper.

The outline of the remainder of this paper is as follows. In Section 2, we give the background to the proposed approach of automatically designing algorithms using GP-based hyper-heuristics. In Section 3 we consider the task of function optimisation and introduce the notion of a function class. Section 4 presents our experimental results, which are analysed in Section 5. In Section 6 we discuss the research presented and in Section 7 we summarise the article and outline potential further research directions.

Section snippets

Hyper-heuristics and automated generation of heuristics

The key distinction between metaheuristics and hyper-heuristics is that the former operate directly on the solution search space, while the latter operate indirectly on the solution search space, working with a set of low-level heuristics or heuristic components. Hyper-heuristics come in two main types: heuristics to choose heuristics and heuristics to generate heuristics [5]. In addition to the broad distinction between selection and generation hyper-heuristics, it is also possible to classify

Optimisation and function classes

In this section we discuss optimisation with evolutionary programming (EP), and then introduce function classes as probability distributions over functions. Function classes are central to this paper, differentiating our approach from the standard convention of benchmarking on arbitrary functions. Rather than demonstrating the utility of an optimisation algorithm for specific arbitrary functions, we demonstrate the utility of an ADM on a set of functions drawn from a fixed probability
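As a minimal illustration of a function class, consider shifted sphere functions whose optimum location is drawn at random. This parameterisation is an assumed example for exposition, not necessarily one of the paper's 23 benchmark classes:

```python
import numpy as np

def sample_function(rng, dim=10, low=-5.0, high=5.0):
    """Draw one instance from an illustrative function class: shifted
    sphere f_c(x) = sum((x - c)^2), with the shift c sampled uniformly.
    The class is the distribution over c; each draw is one function."""
    c = rng.uniform(low, high, size=dim)  # random shift defines the instance
    def f(x):
        return float(np.sum((np.asarray(x) - c) ** 2))
    return f

rng = np.random.default_rng(7)
train_set = [sample_function(rng) for _ in range(20)]  # instances used for training
test_set = [sample_function(rng) for _ in range(50)]   # unseen instances from the same class
```

An ADM trained on `train_set` is then scored on `test_set`, mirroring the train-and-test protocol: generalisation is measured within the class, not on a single arbitrary function.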

Experimental design

In this section we describe the experimental set-up of GP and EP. With hyper-heuristic approaches, it is important to identify the two levels at which the heuristics operate. In a typical hyper-heuristic, a metaheuristic operates on a space of (meta) heuristics, which operate directly on the space of solutions. Here we use GP as a mutation operator generator at the hyper-level to manipulate the mutation operators within a population of EP algorithms working at the base level. The overall

Analysis of the performance of the automatically designed mutation operators

Table 5 reports the average best values obtained over 50 EP runs of each of the 23 function classes using a number of different mutation operators. The corresponding standard deviations are shown underneath each mean value in parentheses. These values are displayed for Cauchy (FEP, Lévy with α = 1.0), Lévy with α = 1.2, α = 1.4, α = 1.6 and α = 1.8, and Gaussian (CEP, Lévy with α = 2.0), as well as the best ADM evolved by GP for that function class. The best values (lowest, as we are minimising) are in
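The operators compared here are members of the symmetric Lévy-stable family: α = 1 recovers the Cauchy distribution (FEP) and α = 2 the Gaussian (CEP, up to a scale factor). A sketch of sampling from this family via the standard Chambers-Mallows-Stuck method (our own illustration, not the paper's implementation):

```python
import numpy as np

def levy_stable_symmetric(alpha, size, rng):
    """Draw samples from the symmetric alpha-stable distribution
    (beta = 0) using the Chambers-Mallows-Stuck transformation."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform angle
    w = rng.exponential(1.0, size)                # unit exponential
    if alpha == 1.0:
        return np.tan(u)                          # exact Cauchy special case
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

rng = np.random.default_rng(1)
cauchy_like = levy_stable_symmetric(1.0, 10000, rng)  # alpha = 1: Cauchy (FEP)
gauss_like = levy_stable_symmetric(2.0, 10000, rng)   # alpha = 2: N(0, 2) (CEP, up to scale)
```

Intermediate values such as α = 1.2 to 1.8 interpolate between the heavy-tailed Cauchy and the light-tailed Gaussian, which is why the Lévy family spans the human-designed operators in the comparison.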

Discussion

One of the advantages of the new method presented here is that it eliminates the need for human researchers to continually propose new distributions for use as mutation operators in EP. Instead, we have a search space which contains a rich set of mutation operators, and we can let a metaheuristic, such as GP, sample this space and select a suitable choice for the sample of functions at hand. In addition, it designs an ADM within the context of a function class. In other words, it tailors a

Summary and future work

In this paper we have used genetic programming (GP) as an offline hyper-heuristic to automatically evolve probability distributions, to use as mutation operators in evolutionary programming (EP). This is in contrast to existing operators in the literature which are human designed. The function and terminal set for GP was chosen to be able to express a number of currently existing human designed mutation operators, namely Cauchy, Gaussian and Lévy, and also express novel automatically designed

Acknowledgements

This work has been partially funded by the DAASE project, EPSRC grant EP/J017515/1, and the FAIME project, EPSRC grant EP/N002849/1.

References (43)

  • R. Poli, W.B. Langdon, N.F. McPhee, A field guide to genetic programming, published via http://lulu.com and freely...
  • E. Burke et al., Exploring hyper-heuristic methodologies with genetic programming
  • L. Hong et al., Automated design of probability distributions as mutation operators for evolutionary programming using genetic programming, Genetic Programming, Vol. 7831 of LNCS (2013)
  • L. Hong et al., Automatically designing more general mutation operators of evolutionary programming for groups of function classes using a hyper-heuristic
  • A.K. Qin et al., Differential evolution algorithm with strategy adaptation for global numerical optimization, IEEE Trans. Evol. Comput. (2009)
  • J. Kennedy et al., Particle swarm optimization
  • N. Hansen, The CMA evolution strategy: a comparing review, Towards a New Evolutionary Computation, Vol. 192 of Studies in Fuzziness and Soft Computing (2006)
  • J.R. Woodward et al., Automatically designing selection heuristics
  • J.R. Woodward et al., The automatic generation of mutation operators for genetic algorithms
  • L. Dioşan et al., Evolving crossover operators for function optimization
  • S. Nguyen et al., Automatic design of scheduling policies for dynamic multi-objective job shop scheduling via cooperative coevolution genetic programming, IEEE Trans. Evol. Comput. (2013)