Improving Genetic Programming for Classification with Lazy Evaluation and Dynamic Weighting

  • Conference paper
  • In: Computational Intelligence (IJCCI 2017)
  • Part of the book series: Studies in Computational Intelligence (SCI, volume 829)
Abstract

In the standard process of evolving classification decision trees with genetic programming, evaluation is the most time-consuming part of the evolution loop. Here we introduce a lazy evaluation approach that does not evaluate the whole population, but only the individuals chosen to participate in tournament selection. Furthermore, we use dynamic weights for the classification instances: each instance's weight determines its chance of being picked for the evaluation process and changes based on that instance's misclassification rate. We thoroughly describe and experiment with lazy evaluation on standard classification benchmark datasets, and show that the lazy evaluation approach not only needs less time to evolve a good solution, but can even produce statistically better solutions, since the changing instance weights help prevent overfitting.
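The two ideas in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the function names, the fitness cache, and the multiplicative weight-update rule are assumptions chosen for clarity; the paper's exact weighting scheme is based on each instance's classification rate.

```python
import random

def tournament_select(population, fitness_fn, k=3, cache=None):
    """Lazy tournament selection: only the k sampled entrants are
    evaluated, and fitness values are cached so an individual that
    enters several tournaments is evaluated at most once."""
    cache = {} if cache is None else cache
    entrants = random.sample(range(len(population)), k)
    for i in entrants:
        if i not in cache:                       # evaluate lazily, on demand
            cache[i] = fitness_fn(population[i])
    return max(entrants, key=cache.__getitem__)  # index of the best entrant

def update_weights(weights, misclassified, rate=0.1):
    """Dynamic instance weighting: raise the sampling weight of a
    misclassified instance and lower it otherwise (illustrative
    multiplicative rule, not necessarily the paper's exact update)."""
    return [w * (1 + rate) if miss else w * (1 - rate)
            for w, miss in zip(weights, misclassified)]
```

With this scheme, the evolution loop pays the evaluation cost only for tournament entrants, and hard-to-classify instances gradually receive more weight in fitness evaluation, which is the mechanism the abstract credits with reducing overfitting.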



Author information

Corresponding author

Correspondence to Sašo Karakatič.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Karakatič, S., Heričko, M., Podgorelec, V. (2019). Improving Genetic Programming for Classification with Lazy Evaluation and Dynamic Weighting. In: Sabourin, C., Merelo, J.J., Madani, K., Warwick, K. (eds) Computational Intelligence. IJCCI 2017. Studies in Computational Intelligence, vol 829. Springer, Cham. https://doi.org/10.1007/978-3-030-16469-0_4
