Solving differential equations with constructed neural networks
Introduction
Problems in many scientific fields can be modelled with differential equations, including problems in physics [1], [2], [3], [4], [5], chemistry [6], [7], [8], biology [9], [10], economics [11], etc. Because of the importance of differential equations, many methods have been proposed in the relevant literature for their solution, such as Runge–Kutta methods [12], [13], [14], predictor–corrector methods [15], [16], [17], radial basis functions [18], [19], artificial neural networks [20], [21], [22], [23], [24], [25], [26], [27], models based on genetic programming [28], [29], etc. In this article a hybrid method is used to solve ordinary differential equations (ODEs), systems of ordinary differential equations (SODEs) and partial differential equations (PDEs); the method combines feed-forward neural networks constructed by grammatical evolution with a local optimization procedure. Constructed neural networks were recently introduced by Tsoulos et al. [30]; the approach utilizes the well-established grammatical evolution technique [31] to evolve the network topology along with the network parameters, and it has been tested with success on a series of data-fitting and classification problems. In this article the constructed neural network methodology is applied to a series of differential equations while preserving the initial or boundary conditions through penalization. The proposed method does not require the user to enter any information regarding the topology of the network, it can be used to solve either ODEs or PDEs, and it can be easily parallelized. This idea is similar to the cascade correlation neural networks introduced by Fahlman and Lebiere [32], in which the user is likewise not required to enter any topology information; however, the method for selecting the network topology differs, since the proposed algorithm is a stochastic one.
In the proposed method, the advantage of using an evolutionary algorithm is that the penalty function (used for initial or boundary conditions) can be incorporated easily into the training process.
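As a rough illustration of how such a penalty can enter the fitness, the sketch below scores a candidate network on a first-order ODE y'(x) = f(x, y) with initial condition y(x0) = y0. The network here is a fixed one-hidden-layer sigmoid model with a numerical derivative; the paper's method instead constructs the topology via grammatical evolution, so the names `mlp`, `fitness` and the value of `penalty_weight` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mlp(x, w):
    # Fixed stand-in network: 3 sigmoid hidden units, 9 parameters in w.
    a, b, v = w[0:3], w[3:6], w[6:9]
    return float(np.sum(v / (1.0 + np.exp(-(a * x + b)))))

def fitness(w, f, x0, y0, xs, penalty_weight=100.0):
    h = 1e-5
    residual = 0.0
    for x in xs:
        # Central-difference estimate of the network derivative.
        dN = (mlp(x + h, w) - mlp(x - h, w)) / (2.0 * h)
        residual += (dN - f(x, mlp(x, w))) ** 2
    # The initial condition enters as a penalised violation,
    # not as a hard constraint on the network.
    return residual + penalty_weight * (mlp(x0, w) - y0) ** 2

# Example: score a random candidate on y' = -y, y(0) = 1 over [0, 1].
xs = np.linspace(0.0, 1.0, 50)
rng = np.random.default_rng(0)
E = fitness(rng.normal(size=9), lambda x, y: -y, 0.0, 1.0, xs)
```

An evolutionary algorithm only needs this scalar score per candidate, which is why the penalty term can be folded into training so easily.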
The rest of this article is organized as follows: Section 2 gives a brief description of the grammatical evolution algorithm, followed by an analytical description of the proposed method; Section 3 outlines the test functions used in the experiments, followed by the experimental results; and Section 4 presents some conclusions.
Section snippets
Method description
In this section a brief description of the grammatical evolution algorithm is given. The main steps of the proposed algorithm are then outlined, together with the fitness evaluation steps for the cases of ODEs, SODEs and PDEs.
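The core of grammatical evolution is the genotype-to-phenotype mapping: each integer gene (codon) selects a production rule of a BNF grammar via the modulo rule `codon % (number of alternatives)`. The sketch below shows this mapping on a toy expression grammar; the paper's actual grammar, which produces whole neural networks, is richer, so `GRAMMAR`, `sig` and `max_wraps` are illustrative assumptions.

```python
# Toy BNF grammar: each key is a non-terminal, each value a list of
# alternative productions (lists of symbols).
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<expr>"], ["sig(", "<expr>", ")"], ["x"], ["<const>"]],
    "<const>": [["1.0"], ["2.0"], ["0.5"]],
}

def map_genome(genome, start="<expr>", max_wraps=2):
    symbols = [start]   # working sentence, expanded left to right
    out = []
    i, wraps = 0, 0
    while symbols:
        sym = symbols.pop(0)
        if sym in GRAMMAR:
            if i >= len(genome):
                # Wrapping: reuse the genome from the start, a bounded
                # number of times, as in standard grammatical evolution.
                i, wraps = 0, wraps + 1
                if wraps > max_wraps:
                    return None  # mapping failed to terminate
            rules = GRAMMAR[sym]
            choice = rules[genome[i] % len(rules)]  # the modulo rule
            i += 1
            symbols = list(choice) + symbols
        else:
            out.append(sym)  # terminal symbol
    return "".join(out)

print(map_genome([2]))     # the single codon 2 selects the terminal "x"
print(map_genome([1, 2]))  # codon 1 wraps "x" in a sigmoid call
```

A local optimization procedure can then be applied to the numeric constants of the mapped expression, as the hybrid scheme described above suggests.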
Experiments
The proposed method was tested on a series of ODEs, non-linear ODEs, SODEs and PDEs with two variables and Dirichlet boundary conditions. These test functions are listed subsequently; they have been used in the experiments performed in [20], [29]. In all the experiments, each equation was sampled at 1000 points drawn from a uniform distribution over its interval of x. In the following subsections, the test problems are presented for each of the four series.
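For the PDE cases with two variables, uniform sampling means drawing collocation points in the interior of the domain plus points on the edges where the Dirichlet penalty is evaluated. The sketch below does this for the unit square; the split between interior and boundary points (`n_interior`, `n_boundary`) is an illustrative assumption, not a value reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_unit_square(n_interior, n_boundary):
    # Interior collocation points, uniform in [0, 1] x [0, 1].
    interior = rng.uniform(0.0, 1.0, size=(n_interior, 2))
    # n_boundary points on each of the four Dirichlet edges.
    t = rng.uniform(0.0, 1.0, size=n_boundary)
    edges = [
        np.column_stack([t, np.zeros_like(t)]),  # edge y = 0
        np.column_stack([t, np.ones_like(t)]),   # edge y = 1
        np.column_stack([np.zeros_like(t), t]),  # edge x = 0
        np.column_stack([np.ones_like(t), t]),   # edge x = 1
    ]
    return interior, np.vstack(edges)

interior, boundary = sample_unit_square(1000, 50)
```

The PDE residual is then summed over `interior`, while the penalty term measures how far the candidate network deviates from the prescribed values on `boundary`.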
Conclusions
In conclusion, a novel method for solving ODEs, SODEs and PDEs was presented. The method utilizes artificial neural networks constructed with grammatical evolution; this technique for simultaneously constructing and training neural networks has been used successfully in other domains. Concerning differential equations, a series of experiments on 19 well-known problems showed that the proposed method managed to solve all of them.
Acknowledgements
All the experiments of this paper were conducted at the Research Center of Scientific Simulations of the University of Ioannina, which comprises 200 computing nodes with dual CPUs (AMD Opteron, 2.2 GHz, 64-bit) running Red Hat Enterprise Linux.
Ioannis Tsoulos received his Ph.D. degree from the Department of Computer Science of the University of Ioannina in 2006. He is currently a Visiting Lecturer at the Department of Communications, Informatics and Management of the Technological Educational Institute of Epirus. His research interest areas include optimization, genetic programming and neural networks.
References (39)
Differential equations method: the calculation of vertex-type Feynman diagrams, Physics Letters B (1991)
Exact solutions for some nonlinear partial differential equations, Physics Letters A (2003)
A comparison of stiff ODE solvers for atmospheric chemistry problems, Atmospheric Environment (1996)
A new approximate whole boundary solution of the Lamm differential equation for the analysis of sedimentation velocity experiments, Biophysical Chemistry (2002)
A delay-differential equation model of HIV infection of CD4+ T-cells, Mathematical Biosciences (2000)
Numerical modelling in biosciences using delay differential equations, Journal of Computational and Applied Mathematics (2000)
Differential equations for moments of present values in life insurance, Insurance: Mathematics and Economics (1995)
Explicit Runge–Kutta methods for parabolic partial differential equations, Applied Numerical Mathematics (1996)
Solving partial differential equations by collocation using radial basis functions, Applied Mathematics and Computation (1998)
Solving differential equations with unsupervised neural networks, Chemical Engineering and Processing: Process Intensification (2003)
Multilayer neural networks for solving a class of partial differential equations, Neural Networks
Solution of nonlinear ordinary differential equations by feedforward neural networks, Mathematical and Computer Modelling
The numerical solution of linear ordinary differential equations by feedforward neural networks, Mathematical and Computer Modelling
Neural network construction and training using grammatical evolution, Neurocomputing
Differential equations for quantum correlation functions, International Journal of Modern Physics B
Controlling chaos in systems described by partial differential equations, Physical Review Letters
Geometric integration: numerical solution of differential equations on manifolds, Philosophical Transactions: Mathematical, Physical and Engineering Sciences
Numerical solution of a partial differential equation system describing chemical kinetics and diffusion in a cell with the aid of compartmentalization, Journal of Computational Chemistry
The Numerical Analysis of Ordinary Differential Equations: Runge–Kutta and General Linear Methods
Cited by (85)
A nonlinear solver based on an adaptive neural network, introduction and application to porous media flow, Journal of Natural Gas Science and Engineering (2021)
On the feed-forward neural network for analyzing pantograph equations, AIP Advances (2024)
Dimitris Gavrilis was born in Athens in 1978. He graduated from the Electrical and Computer Engineering Department in 2002. In 2007 he received his Ph.D. in "Denial-of-Service attacks detection". His research interests include network intrusion detection, feature selection and construction for pattern recognition, and evolutionary neural networks. He currently works as a researcher at Athena Research Centre.
Euripidis Glavas received his Ph.D. degree from the University of Sussex in 1988. He is currently an associate professor in the Department of Communications, Informatics and Management of the Technological Educational Institute of Epirus. His research areas include optical waveguides, computer architecture and neural networks.