A hyper-heuristic approach to automated generation of mutation operators for evolutionary programming
Introduction
Black-box function optimisation is the task of finding the optima of an objective function for which no analytical form is available. This paper is concerned with evolutionary programming (EP) [1], which evolves a population of real-valued input vectors for a function, a technique widely applied to real-world problems [2], [3], [4]. As EP has an evolutionary basis, each vector undergoes selection, evaluation, and mutation, with the expectation of obtaining progressively fitter vectors. Here we focus on the mutation component of EP, which has traditionally been designed manually. A hyper-heuristic is a search method or learning mechanism for selecting or generating heuristics to solve computational search problems [5]. Genetic programming (GP) [6] is a population-based evolutionary computation method for evolving program trees, which has frequently been used as a generation hyper-heuristic in the literature [7].
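As a concrete illustration of the EP loop just described, the following is a minimal sketch; all names and parameter values here are our own illustrative choices, not those used in the paper. Each parent is perturbed by a Gaussian mutation, and the best half of parents and offspring survives.

```python
import random

def sphere(v):
    # Illustrative objective: minimise the sum of squares (optimum at origin).
    return sum(x * x for x in v)

def ep_minimise(f, dim, pop_size=20, generations=100, sigma=0.1):
    # Initialise a population of real-valued input vectors.
    pop = [[random.uniform(-5.0, 5.0) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Mutation: perturb every coordinate of every parent.
        offspring = [[x + random.gauss(0.0, sigma) for x in ind] for ind in pop]
        # Selection: keep the best half of parents + offspring (elitist).
        pop = sorted(pop + offspring, key=f)[:pop_size]
    return pop[0]

best = ep_minimise(sphere, dim=3)
```

Real EP implementations typically also self-adapt mutation step sizes; the fixed sigma above is a simplification for exposition.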
In this paper, we use GP as an offline generation hyper-heuristic to automatically create mutation operators for EP operating on function classes. A mutation operator in EP is a probability distribution, represented as a random number generator. We present an algorithmic framework which can not only express a number of existing EP mutation operators, but also generate novel variants. Using a train-and-test approach, GP evolves mutation operators for EP on a training set drawn from a class of functions; each evolved operator is then validated on a larger set of unseen instances taken from the same class. We use the term automatically designed mutation operators (ADMs) for the mutation operators generated by GP. We demonstrate that the ADMs for EP achieve performance comparable, and often superior, to existing human-designed operators. To further examine the generality of the evolved ADMs, an additional set of experiments trains ADMs on one function class and tests them on a different function class.
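To make the idea of a mutation operator as a GP-composable random number generator concrete, here is an illustrative sketch; the primitives and the particular "evolved" tree below are hypothetical, not operators reported in the paper.

```python
import math
import random

# GP builds mutation operators as program trees over random-number
# primitives; any such tree is itself a random number generator.
def gaussian():
    return random.gauss(0.0, 1.0)

def cauchy():
    # Standard Cauchy variate via the inverse-CDF (tan) method.
    return math.tan(math.pi * (random.random() - 0.5))

def candidate_adm():
    # One hypothetical GP individual: a Gaussian/Cauchy mixture.
    return 0.5 * gaussian() + 0.5 * cauchy()

# Sampling the tree repeatedly yields mutation offsets for EP.
samples = [candidate_adm() for _ in range(1000)]
```

GP search over such trees then amounts to rearranging and reweighting these primitives until the resulting distribution suits the function class at hand.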
This paper builds on our previous work. Hong et al. [8] first demonstrated that GP could automatically construct the random number generators typically used in EP. A second paper showed that ADMs could be trained on collections of function classes, giving good performance across a broader range of functions [9]; however, a trade-off between general training and specific performance was observed. The present paper designs 23 ADMs, one for each of 23 function classes, and then tests each ADM on every function class. Real-valued optimisation is an active research topic, and several population-based metaheuristics have been applied to function optimisation, including differential evolution and its variants [10], [11], particle swarm optimisation [12], the covariance matrix adaptation evolution strategy (CMA-ES) [13] and hybrid methods [14]. However, as this study is concerned specifically with the modification of EP, a full review of these algorithms is beyond the scope of this paper.
The outline of the remainder of this paper is as follows. In Section 2, we give the background to the proposed approach of automatically designing algorithms using GP-based hyper-heuristics. In Section 3 we consider the task of function optimisation and introduce the notion of a function class. Section 4 presents our experimental results, which are analysed in Section 5. In Section 6 we discuss the research presented and in Section 7 we summarise the article and outline potential further research directions.
Section snippets
Hyper-heuristics and automated generation of heuristics
The key distinction between metaheuristics and hyper-heuristics is that the former operate directly on the solution search space, while the latter operate indirectly on the solution search space, working with a set of low-level heuristics or heuristic components. Hyper-heuristics come in two main types: heuristics to choose heuristics and heuristics to generate heuristics [5]. In addition to the broad distinction between selection and generation hyper-heuristics, it is also possible to classify
Optimisation and function classes
In this section we discuss optimisation with evolutionary programming (EP), and then introduce function classes as probability distributions over functions. Function classes are central to this paper, differentiating our approach from the standard convention of benchmarking on arbitrary functions. Rather than demonstrating the utility of an optimisation algorithm for specific arbitrary functions, we demonstrate the utility of an ADM on a set of functions drawn from a fixed probability distribution.
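The notion of a function class can be sketched as a sampler that returns a fresh function instance on each draw. The "shifted sphere" class below is our own illustrative example, not one of the paper's 23 classes.

```python
import random

def sample_sphere_instance(dim, shift_range=(-3.0, 3.0)):
    """Draw one instance from a hypothetical 'shifted sphere' class:
    the optimum sits at a random location, so an operator trained on
    some instances must generalise to unseen draws from the class."""
    shift = [random.uniform(*shift_range) for _ in range(dim)]
    def instance(x):
        return sum((xi - si) ** 2 for xi, si in zip(x, shift))
    return instance

# Train-and-test split: a small training sample and a larger unseen set.
training_set = [sample_sphere_instance(10) for _ in range(5)]
unseen_set = [sample_sphere_instance(10) for _ in range(20)]
```

An ADM is scored on the training sample during evolution and validated on the unseen instances, mirroring the train-and-test methodology described above.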
Experimental design
In this section we describe the experimental set-up of GP and EP. With hyper-heuristic approaches, it is important to identify the two levels at which the heuristics operate. In a typical hyper-heuristic, a metaheuristic operates on a space of (meta) heuristics, which operate directly on the space of solutions. Here we use GP as a mutation operator generator at the hyper-level to manipulate the mutation operators within a population of EP algorithms working at the base level. The overall
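The two-level arrangement can be sketched as follows; this is a simplified mock-up, not the paper's implementation. The base level runs EP with a candidate mutation sampler, and the hyper level scores that candidate by its mean result over the training functions.

```python
import random

def run_ep(f, mutate, dim=5, pop_size=10, generations=50):
    # Base level: a complete EP run using the candidate mutation
    # sampler `mutate`; returns the best objective value found.
    pop = [[random.uniform(-5.0, 5.0) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [[x + mutate() for x in ind] for ind in pop]
        pop = sorted(pop + offspring, key=f)[:pop_size]
    return f(pop[0])

def hyper_fitness(mutate, training_functions, repeats=3):
    # Hyper level: score a candidate operator by its mean EP result
    # over the training sample (lower is better, as EP minimises).
    runs = [run_ep(f, mutate) for f in training_functions
            for _ in range(repeats)]
    return sum(runs) / len(runs)

def sphere(v):
    return sum(x * x for x in v)

score = hyper_fitness(lambda: random.gauss(0.0, 0.1), [sphere])
```

In the actual framework, GP supplies the candidate `mutate` samplers as evolved program trees; here a fixed Gaussian stands in for one candidate.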
Analysis of the performance of the automatically designed mutation operators
Table 5 reports the average best values obtained over 50 EP runs of each of the 23 function classes using a number of different mutation operators. The corresponding standard deviations are shown underneath each mean value in parentheses. These values are displayed for Cauchy (FEP, Lévy with α = 1.0), Lévy with α = 1.2, α = 1.4, α = 1.6 and α = 1.8, and Gaussian (CEP, Lévy with α = 2.0), as well as the best ADM evolved by GP for that function class. The best values (lowest, as we are minimising) are in
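The contrast between the Cauchy (FEP) and Gaussian (CEP) operators compared above comes down to tail weight, which a quick simulation makes visible. The experiment below is our own illustration, not taken from the paper; the standard Cauchy variate is drawn via the inverse-CDF method.

```python
import math
import random

def cauchy():
    # Standard Cauchy variate via the inverse-CDF (tan) method.
    return math.tan(math.pi * (random.random() - 0.5))

n = 100_000
# Fraction of mutation offsets with magnitude above three "units".
gauss_large = sum(abs(random.gauss(0.0, 1.0)) > 3.0 for _ in range(n)) / n
cauchy_large = sum(abs(cauchy()) > 3.0 for _ in range(n)) / n
# The Cauchy's heavy tails produce long jumps orders of magnitude more
# often (roughly 20% vs 0.3%), which is why Cauchy-based mutation can
# escape local optima that trap Gaussian-based mutation.
```

Lévy distributions with 1 < α < 2 interpolate between these two extremes, which is what the intermediate α settings in the comparison probe.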
Discussion
One of the advantages of the new method presented here is that it eliminates the need for human researchers to continually propose new distributions for use as mutation operators in EP. Instead, we have a search space which contains a rich set of mutation operators, and we can let a metaheuristic, such as GP, sample this space and select a suitable choice for the sample of functions at hand. In addition, it designs an ADM within the context of a function class. In other words, it tailors a
Summary and future work
In this paper we have used genetic programming (GP) as an offline hyper-heuristic to automatically evolve probability distributions for use as mutation operators in evolutionary programming (EP). This is in contrast to existing operators in the literature, which are human-designed. The function and terminal set for GP was chosen to be able to express a number of existing human-designed mutation operators, namely Cauchy, Gaussian and Lévy, and also to express novel automatically designed mutation operators.
Acknowledgements
This work has been partially funded by the DAASE project, EPSRC grant EP/J017515/1, and the FAIME project, EPSRC grant EP/N002849/1.
References (43)
- et al., A multiobjective evolutionary programming framework for graph-based data mining, Inform. Sci. (2013)
- et al., Differential evolution algorithm with ensemble of parameters and mutation strategies, Appl. Soft Comput. (2011)
- et al., Adaptive configuration of evolutionary algorithms for constrained optimization, Appl. Math. Comput. (2013)
- et al., Evolutionary programming using a mixed mutation strategy, Inform. Sci. (2007)
- et al., Ensemble strategies with adaptive evolutionary programming, Inform. Sci. (2010)
- et al., The irace package: iterated racing for automatic algorithm configuration, Oper. Res. Perspect. (2016)
- et al., Fast evolutionary programming, Proceedings of the Fifth Annual Conference on Evolutionary Programming (1996)
- et al., Evolutionary programming techniques for economic load dispatch, IEEE Trans. Evol. Comput. (2003)
- et al., Taboo evolutionary programming approach to optimal transfer from Earth to Mars, Proceedings of Swarm, Evolutionary, and Memetic Computing (SEMCCO 2011) – Part II, Vol. 7077 of LNCS (2011)
- et al., A classification of hyper-heuristic approaches
- Exploring hyper-heuristic methodologies with genetic programming
- Automated design of probability distributions as mutation operators for evolutionary programming using genetic programming, Genetic Programming, Vol. 7831 of LNCS
- Automatically designing more general mutation operators of evolutionary programming for groups of function classes using a hyper-heuristic
- Differential evolution algorithm with strategy adaptation for global numerical optimization, IEEE Trans. Evol. Comput.
- Particle swarm optimization
- The CMA evolution strategy: a comparing review, Towards a New Evolutionary Computation, Vol. 192 of Studies in Fuzziness and Soft Computing
- Automatically designing selection heuristics
- The automatic generation of mutation operators for genetic algorithms
- Evolving crossover operators for function optimization
- Automatic design of scheduling policies for dynamic multi-objective job shop scheduling via cooperative coevolution genetic programming, IEEE Trans. Evol. Comput.