Highly Accurate Symbolic Regression with Noisy Training Data

Genetic Programming Theory and Practice XIII

Part of the book series: Genetic and Evolutionary Computation ((GEVO))

Abstract

As symbolic regression (SR) has advanced into the early stages of commercial exploitation, the poor accuracy of SR, which still plagues even the most advanced commercial packages, has become an issue for early adopters. Users expect the correct formula to be returned, especially in cases with zero noise and only one basis function of minimal grammar depth.
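To make the expectation concrete, the following sketch builds a zero-noise dataset from a single basis function and checks that only the exact formula achieves zero error. The target formula, its coefficient, and the near-miss candidate are all illustrative assumptions, not taken from the chapter.

```python
import numpy as np

# Illustrative zero-noise target built from one basis function:
# y = 2.5 * cos(x). The form and coefficient are made up for this sketch.
rng = np.random.default_rng(0)
x = rng.uniform(-5.0, 5.0, size=200)
y = 2.5 * np.cos(x)

# An SR tool that recovers the exact formula achieves zero error;
# a structurally near-miss candidate does not.
exact = 2.5 * np.cos(x)
near_miss = 2.5 * np.sin(x)

mse_exact = float(np.mean((y - exact) ** 2))
mse_miss = float(np.mean((y - near_miss) ** 2))
print(mse_exact, mse_miss)  # exact recovery gives 0.0
```

On noiseless data with a single shallow basis function, "correct formula returned" is an all-or-nothing test: any structural error leaves residual error that exact recovery does not.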

At a minimum, users expect the response surface of the SR tool to be easily understood, so that the user knows a priori on which classes of problems to expect excellent, average, or poor accuracy. Poor or unknown accuracy is a hindrance to greater academic and industrial acceptance of SR tools.

In two previous papers, we published a complex algorithm for modern symbolic regression that is extremely accurate for a large class of SR problems; that class of problems is described in detail in those papers. The algorithm is extremely accurate, in reasonable time on a single processor, for problems with from 25 up to 3000 features (columns).

Extensive, statistically correct, out-of-sample training and testing demonstrated the extreme-accuracy algorithm's advantages over a previously published baseline Pareto algorithm in cases where the training and testing data contained zero noise.
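The out-of-sample protocol described above can be sketched as follows: fit on one sample of the data and score on a disjoint sample. A plain linear least-squares fit stands in for the SR search, and the error measure is a normalized least-squares error; both choices are assumptions for illustration, not the chapter's exact setup.

```python
import numpy as np

# Hypothetical noiseless target over three features.
rng = np.random.default_rng(42)
X = rng.uniform(-10, 10, size=(1000, 3))
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] ** 2 + X[:, 2]

# Statistically correct evaluation: disjoint train and test rows.
train, test = slice(0, 700), slice(700, 1000)

# Fit coefficients on the training rows only
# (feature matrix augmented with the x1^2 basis term).
F = np.column_stack([X[:, 0], X[:, 1] ** 2, X[:, 2]])
coef, *_ = np.linalg.lstsq(F[train], y[train], rcond=None)

# Score on the held-out rows with a normalized least-squares error.
pred = F[test] @ coef
nlse = np.sum((y[test] - pred) ** 2) / np.sum((y[test] - y[test].mean()) ** 2)
print(float(nlse) < 1e-6)  # near zero: the fit generalizes out of sample
```

The point of the protocol is that test rows never influence the fit, so a near-zero test error is evidence of generalization rather than memorization.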

While the algorithm’s extreme accuracy for deep problems with a large number of features on noiseless training data is an impressive advance, there are many very important academic and industrial SR problems where the training data is very noisy.

In this chapter we test the extreme-accuracy algorithm and compare the results with the previously published baseline Pareto algorithm. Both algorithms' performance is compared on a set of complex representative problems (from 25 to 3000 features): on noiseless training data, on noisy training data, and on noisy training data with range-shifted testing data.
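The three training/testing regimes named above can be sketched by how the data is generated. The ground-truth formula, the 20% noise level, and the doubled test range are illustrative assumptions, not values taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(7)

def target(x):
    # Hypothetical ground-truth formula for this sketch.
    return 3.0 * x ** 2 - x

# Regime 1: noiseless training data on the base input range.
x_train = rng.uniform(-10, 10, 500)
y_clean = target(x_train)

# Regime 2: noisy training data -- additive Gaussian noise
# scaled to 20% of the signal's standard deviation (assumed level).
noise = 0.20 * np.std(y_clean) * rng.standard_normal(500)
y_noisy = y_clean + noise

# Regime 3: range-shifted testing data -- test inputs drawn from a
# wider range than training, so the model must extrapolate.
x_test = rng.uniform(-20, 20, 500)
y_test = target(x_test)

print(x_train.shape, y_noisy.shape, x_test.shape)
```

Range-shifted testing is the harshest of the three regimes: a model that merely interpolates the noisy training sample will fail where the test inputs fall outside the training range.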

The enhanced algorithm is shown to be robust, with definite advantages over the baseline Pareto algorithm, performing well even in the face of noisy training data and range shifted testing data.


References

  • Korns MF (2010) Abstract expression grammar symbolic regression. In: Riolo R, McConaghy T, Vladislavleva E (eds.) Genetic programming theory and practice VIII. Genetic and evolutionary computation, Chap. 7, vol. 8. Springer, Ann Arbor, pp 109–128. http://www.springer.com/computer/ai/book/978-1-4419-7746-5


  • Korns MF (2011) Accuracy in symbolic regression. In: Riolo R, Vladislavleva E, Moore JH (eds.) Genetic programming theory and practice IX. Genetic and evolutionary computation, Chap. 8. Springer, Ann Arbor, pp 129–151. doi:10.1007/978-1-4614-1770-5_8


  • Korns MF (2012) A baseline symbolic regression algorithm. In: Riolo R, Vladislavleva E, Ritchie MD, Moore JH (eds.) Genetic programming theory and practice X. Genetic and evolutionary computation, Chap. 9. Springer, Ann Arbor, pp 117–137. doi:10.1007/978-1-4614-6846-2_9


  • Korns MF (2013) Extreme accuracy in symbolic regression. In: Riolo R, Moore JH, Kotanchek M (eds.) Genetic programming theory and practice XI. Genetic and evolutionary computation, Chap. 1. Springer, Ann Arbor, pp 1–30. doi:10.1007/978-1-4939-0375-7_1


  • Korns MF (2014) Extremely accurate symbolic regression for large feature problems. In: Riolo R, Worzel WP, Kotanchek M (eds.) Genetic programming theory and practice XII. Genetic and evolutionary computation. Springer, Ann Arbor, pp 109–131. doi:10.1007/978-3-319-16030-6_7


  • Kotanchek M, Smits G, Vladislavleva E (2007) Trustable symbolic regression models: using ensembles, interval arithmetic and Pareto fronts to develop robust and trust-aware models. In: Riolo RL, Soule T, Worzel B (eds.) Genetic programming theory and practice V. Genetic and evolutionary computation, Chap. 12. Springer, Ann Arbor, pp 201–220. doi:10.1007/978-0-387-76308-8_12


  • Koza JR (1992) Genetic programming: on the programming of computers by means of natural selection. MIT Press, Cambridge, MA. http://mitpress.mit.edu/books/genetic-programming


  • McConaghy T (2011) FFX: fast, scalable, deterministic symbolic regression technology. In: Riolo R, Vladislavleva E, Moore JH (eds.) Genetic programming theory and practice IX. Genetic and evolutionary computation, Chap. 13. Springer, Ann Arbor, pp 235–260. doi:10.1007/978-1-4614-1770-5_13. http://trent.st/content/2011-GPTP-FFX-paper.pdf


  • Nelder JA, Wedderburn RW (1972) Generalized linear models. J R Stat Soc Ser A 135(3):370–384


  • Smits G, Kordon A, Vladislavleva K, Jordaan E, Kotanchek M (2005) Variable selection in industrial datasets using Pareto genetic programming. In: Yu T, Riolo RL, Worzel B (eds.) Genetic programming theory and practice III. Genetic programming, Chap. 6, vol. 9. Springer, Ann Arbor, pp 79–92. doi:10.1007/0-387-28111-8_6


Author information

Corresponding author

Correspondence to Michael F. Korns.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Korns, M.F. (2016). Highly Accurate Symbolic Regression with Noisy Training Data. In: Riolo, R., Worzel, W., Kotanchek, M., Kordon, A. (eds) Genetic Programming Theory and Practice XIII. Genetic and Evolutionary Computation. Springer, Cham. https://doi.org/10.1007/978-3-319-34223-8_6

  • DOI: https://doi.org/10.1007/978-3-319-34223-8_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-34221-4

  • Online ISBN: 978-3-319-34223-8

  • eBook Packages: Computer Science, Computer Science (R0)
