Elsevier

Neurocomputing

Volume 72, Issues 10–12, June 2009, Pages 2385–2391

Solving differential equations with constructed neural networks

https://doi.org/10.1016/j.neucom.2008.12.004

Abstract

A novel hybrid method for the solution of ordinary and partial differential equations is presented here. The method creates trial solutions in neural network form using a scheme based on grammatical evolution. The trial solutions are enhanced periodically using a local optimization procedure. The proposed method is tested on a series of ordinary differential equations, systems of ordinary differential equations as well as on partial differential equations with Dirichlet boundary conditions and the results are reported.

Introduction

A series of problems in many scientific fields can be modelled with the use of differential equations, such as problems in physics [1], [2], [3], [4], [5], chemistry [6], [7], [8], biology [9], [10], economics [11], etc. Due to the importance of differential equations, many methods have been proposed in the relevant literature for their solution, such as Runge–Kutta methods [12], [13], [14], predictor–corrector methods [15], [16], [17], radial basis functions [18], [19], artificial neural networks [20], [21], [22], [23], [24], [25], [26], [27], models based on genetic programming [28], [29], etc. In this article a hybrid method utilizing feed-forward neural networks constructed by grammatical evolution and a local optimization procedure is used to solve ordinary differential equations (ODEs), systems of ordinary differential equations (SODEs) and partial differential equations (PDEs). The construction of neural networks with grammatical evolution was recently introduced by Tsoulos et al. [30]; it utilizes the well-established grammatical evolution technique [31] to evolve the network topology along with the network parameters, and it has been tested with success on a series of data-fitting and classification problems. In this article the constructed neural network methodology is applied to a series of differential equations while preserving the initial or boundary conditions through penalization. The proposed method does not require the user to enter any information regarding the topology of the network. Moreover, the new method can be used to solve either ODEs or PDEs and it can be easily parallelized. This idea is similar to the cascade correlation neural networks introduced by Fahlman and Lebiere [32], in which the user is likewise not required to enter any topology information; however, the method for selecting the network topology differs, since the proposed algorithm is a stochastic one. In the proposed method, the advantage of using an evolutionary algorithm is that the penalty function (used for the initial or boundary conditions) can be incorporated easily into the training process.
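To make the penalty-based formulation concrete, the following is a minimal sketch (in Python/NumPy) of how a fitness value can be assigned to a candidate trial solution of a first-order ODE y' = f(x, y) with y(x0) = y0: the squared residual of the equation is summed over the sample points and the initial condition is enforced through a penalty term. The fixed sigmoid network, the central-difference derivative and the penalty weight are illustrative assumptions, not the construction used in the paper, where the network form itself is evolved by grammatical evolution.

```python
import numpy as np

def network(x, params):
    """Candidate trial solution N(x): a fixed feed-forward network with
    sigmoid hidden units.  In the paper the network form is evolved by
    grammatical evolution; this fixed form is only an illustration."""
    w, a, b = params            # output weights, input weights, biases
    return np.sum(w / (1.0 + np.exp(-(a * x + b))))

def fitness(params, f, x_samples, x0, y0, penalty=100.0):
    """Squared ODE residual summed over the sample points, plus a penalty
    term enforcing the initial condition y(x0) = y0.  The penalty weight
    is a hypothetical choice, not a value taken from the paper."""
    h = 1e-6                    # central-difference step for N'(x)
    error = 0.0
    for x in x_samples:
        n = network(x, params)
        dn = (network(x + h, params) - network(x - h, params)) / (2.0 * h)
        error += (dn - f(x, n)) ** 2
    return error + penalty * (network(x0, params) - y0) ** 2

# Example: y' = -y with y(0) = 1, sampled uniformly on [0, 1].
rng = np.random.default_rng(0)
x_samples = rng.uniform(0.0, 1.0, 50)
params = (rng.normal(size=5), rng.normal(size=5), rng.normal(size=5))
print(fitness(params, lambda x, y: -y, x_samples, 0.0, 1.0))
```

An evolutionary algorithm can minimize this fitness directly, which is why the boundary-condition penalty folds naturally into the training process.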

The rest of this article is organized as follows: in Section 2 a brief description of the grammatical evolution algorithm is given, followed by an analytical description of the proposed method; in Section 3 the test functions used in the experiments are listed along with the experimental results; and in Section 4 some conclusions are drawn.

Section snippets

Method description

In this section a brief description of the grammatical evolution algorithm is given. The main steps of the proposed algorithm are then outlined, together with the fitness evaluation procedure for the cases of ODEs, SODEs and PDEs.
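As an illustration of the grammatical evolution step, the sketch below decodes a chromosome of integer codons into an expression by repeatedly expanding the left-most non-terminal with the production rule selected modulo the number of alternatives. The toy grammar is a stand-in for the network-construction grammar of Tsoulos et al. [30], not the grammar actually used in the paper.

```python
# Toy grammatical evolution decoder: each codon selects a production rule
# modulo the number of alternatives for the current non-terminal.
GRAMMAR = {
    "<expr>": ["<expr>+<expr>", "<expr>*<expr>", "sig(<expr>)", "x", "<const>"],
    "<const>": ["0.5", "1.0", "2.0"],
}

def decode(chromosome, start="<expr>", max_wraps=2):
    genome = list(chromosome) * max_wraps        # wrapping, as in standard GE
    result, idx = start, 0
    while "<" in result and idx < len(genome):
        # expand the left-most non-terminal
        nt = result[result.index("<"): result.index(">") + 1]
        rules = GRAMMAR[nt]
        choice = rules[genome[idx] % len(rules)]
        result = result.replace(nt, choice, 1)
        idx += 1
    return result

print(decode([2, 0, 3, 4, 1]))   # -> "sig(x+1.0)"
```

In the actual method the decoded individual is a neural network rather than a raw expression, and its parameters are refined periodically by the local optimization procedure.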

Experiments

The proposed method was tested on a series of ODEs, non-linear ODEs, SODEs and PDEs with two variables and Dirichlet boundary conditions. These test functions are listed subsequently; they have been used in the experiments performed in [20], [29]. In all the experiments the equations were sampled using a uniform distribution, with only 1000 sample points drawn from the interval of x in each case. In the following subsections the test equations are presented for each of the four series.
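The sketch below illustrates the sampling scheme described above: 1000 points drawn uniformly from the interval of x, at which the residual of a candidate solution is evaluated. The interval, the candidate and the test equation are placeholders, not the benchmark problems of [20], [29].

```python
import numpy as np

# 1000 sample points drawn uniformly from the interval on which the
# equation is posed (the interval [0, 1] is a placeholder).
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 1000)

y = np.exp(-x) * np.sin(x)                     # candidate solution y(x)
dy = np.exp(-x) * (np.cos(x) - np.sin(x))      # its derivative y'(x)

# Residual of the illustrative ODE y' + y = exp(-x) * cos(x) at the samples.
residual = dy + y - np.exp(-x) * np.cos(x)
print("max |residual| over 1000 uniform samples:", np.abs(residual).max())
```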

Conclusions

In conclusion, a novel method for solving ODEs, PDEs and SODEs is presented. This method utilizes artificial neural networks that are constructed using grammatical evolution. This novel technique for simultaneously constructing and training neural networks has been used successfully in other domains. Concerning the differential equations problem, a series of experiments on 19 well-known problems showed that the proposed method managed to solve all the…

Acknowledgements

All the experiments of this paper were conducted at the Research Center of Scientific Simulations of the University of Ioannina, which is composed of 200 computing nodes with dual CPUs (AMD Opteron 2.2 GHz, 64-bit) running Red Hat Enterprise Linux.

References (39)

  • S. He et al., Multilayer neural networks for solving a class of partial differential equations, Neural Networks (2000)
  • A.J. Meade et al., Solution of nonlinear ordinary differential equations by feedforward neural networks, Mathematical and Computer Modelling (1994)
  • A.J. Meade et al., The numerical solution of linear ordinary differential equations by feedforward neural networks, Mathematical and Computer Modelling (1994)
  • I. Tsoulos et al., Neural network construction and training using grammatical evolution, Neurocomputing (2008)
  • A.R. Its et al., Differential equations for quantum correlation functions, International Journal of Modern Physics B (1990)
  • H. Gang et al., Controlling chaos in systems described by partial differential equations, Physical Review Letters (1993)
  • C.J. Budd et al., Geometric integration: numerical solution of differential equations on manifolds, Philosophical Transactions: Mathematical, Physical and Engineering Sciences (1999)
  • U. Salzner et al., Numerical solution of a partial differential equation system describing chemical kinetics and diffusion in a cell with the aid of compartmentalization, Journal of Computational Chemistry (1990)
  • J.C. Butcher, The Numerical Analysis of Ordinary Differential Equations: Runge–Kutta and General Linear Methods (1987)

    Ioannis Tsoulos received his Ph.D. degree from the Department of Computer Science of the University of Ioannina in 2006. He is currently a Visiting Lecturer at the Department of Communications, Informatics and Management of the Technological Educational Institute of Epirus. His research interest areas include optimization, genetic programming and neural networks.

    Dimitris Gavrilis was born in Athens in 1978. He graduated from the Electrical and Computer Engineering Department in 2002. In 2007 he received his Ph.D. in “Denial-of-Service attacks detection”. His research interests include network intrusion detection, feature selection and construction for pattern recognition, and evolutionary neural networks. He currently works as a researcher at the Athena Research Centre.

    Euripidis Glavas received his Ph.D. degree from the University of Sussex in 1988. He is currently an associate professor in the Department of Communications, Informatics and Management of the Technological Educational Institute of Epirus. His research areas include optical wave guide, computer architecture and neural networks.
