Created by W.Langdon from gp-bibliography.bib Revision:1.8051
The paper discusses the main schemes for building empirical emulators. Several contemporary techniques for robust empirical emulator design are explored, including analytic neural networks, recurrent neural networks, and Genetic Programming (GP), and the capabilities of the proposed approach are illustrated with a case study on a simple first principle model.
A key feature of empirical emulators is that the training data for empirical model building is generated by design of experiments from first principle models, called simulators. This allows a high degree of freedom in developing reliable data-driven models. The most obvious scheme for implementing an empirical emulator is as an accelerator of the fundamental model (the gain is a factor of 10^3 to 10^5 in computation time). Another possible scheme is to use the empirical emulator as an estimator of fundamental model performance. Of special importance to on-line optimization is a scheme that uses the empirical emulator to integrate different types of fundamental models (steady-state, dynamic, fluid, kinetic, thermal, etc.).

Most known empirical emulators are implemented as "classical" neural networks based on the back-propagation learning algorithm. Their property of being universal approximators is a key theoretical result for successful emulation. At the same time, "classical" neural networks suffer from a number of problems: long training times, convergence to local minima, sensitivity to weight initialization, too many tunable parameters, etc. These problems place serious limitations on the quality of the developed empirical model, increase development time, and require experienced model developers.

An alternative empirical emulator, based on analytic neural networks, is described in the paper. A key advantage of analytic neural networks is that the function to be optimized is a quadratic function of the hidden-to-output layer weights and therefore has a single global optimum. It is no longer possible to get stuck in local minima, and the learning algorithm is not iterative. As a result, data-driven model development time is significantly reduced and the resulting empirical models are parsimonious.
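The one-shot training idea can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's exact formulation: a single hidden layer with fixed random input weights, so that fitting reduces to a linear least-squares problem for the hidden-to-output weights. Because the loss is quadratic in those weights, the solve is non-iterative and reaches the global optimum directly. The "simulator" here is simply sin(x), standing in for a first principle model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data from a hypothetical simulator: y = sin(x) on [0, 2*pi]
X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()

# Illustrative architecture choice (assumption): 30 tanh hidden units
n_hidden = 30
W_in = rng.normal(size=(X.shape[1], n_hidden))  # fixed random input weights
b_in = rng.normal(size=n_hidden)                # fixed random biases

H = np.tanh(X @ W_in + b_in)                    # hidden-layer activations

# Hidden-to-output weights: the loss is quadratic in w_out, so a single
# least-squares solve finds the global optimum -- no iterative training.
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)

y_hat = H @ w_out
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
print(f"training RMSE: {rmse:.4f}")
```

The contrast with back-propagation is the point: there is no learning rate, no epochs, and no risk of stopping in a local minimum, which is what makes the development cycle short and repeatable.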
Of special importance to an empirical emulator's performance is the ability of analytic neural networks to deliver a multiple-model solution with confidence limits. Empirical emulators with confidence limits are aware of their own performance, which is essential for any data-driven model application, especially in real time. For emulating process dynamics, a different type of network, the recurrent neural network, is needed. Recurrent networks are neural networks with one or more local or global feedback loops. Feedback enables neural networks to acquire state representations, making them suitable for emulating dynamic fundamental models. A suitable structure for an empirical emulator that mimics dynamic behavior is a recurrent version of the analytic neural network.
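The multiple-model idea behind the confidence limits can be sketched as follows. This is an illustrative assumption, not the authors' exact design: train an ensemble of small random-feature networks on the same simulator data and use the spread of their predictions as an approximate confidence band, so the emulator reports how sure it is of its own output.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy data from a hypothetical simulator: y = sin(x) plus small noise
X = np.linspace(0.0, 2.0 * np.pi, 100).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(scale=0.05, size=100)

def fit_predict(seed, X, y, X_new, n_hidden=20):
    """Fit one random-feature network (closed-form solve) and predict."""
    r = np.random.default_rng(seed)
    W = r.normal(size=(X.shape[1], n_hidden))
    b = r.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return np.tanh(X_new @ W + b) @ w

# Ensemble of 25 models, each with different random hidden features
X_new = np.array([[np.pi / 2]])
preds = np.array([fit_predict(s, X, y, X_new)[0] for s in range(25)])

mean = preds.mean()
lo, hi = mean - 2 * preds.std(), mean + 2 * preds.std()
print(f"prediction: {mean:.3f}, approx. band: [{lo:.3f}, {hi:.3f}]")
```

A wide band flags regions where the emulator disagrees with itself, which is exactly the self-awareness the abstract argues is needed before trusting a data-driven model in real-time use.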
Genetic Programming entries for Peter Kip Mercure, Guido F Smits, Arthur K Kordon