Abstract
This paper presents an algorithm for inducing ensembles of decision trees, also referred to as decision forests. To achieve high expressiveness, the induced trees are multivariate, with various, possibly user-defined tests in their internal nodes. Strongly typed genetic programming is used to evolve the structure of these tests. Special attention is given to the diversity of the constructed forest: an approach is proposed that explicitly encourages the induction algorithm to produce a different tree in each run, each representing an alternative description of the data. Forests constructed this way are shown to have significantly lower classification error, even at small forest sizes, than other ensemble methods. Classification accuracy is also compared with that of other recent methods on several real-world datasets.
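The core idea of the diversity mechanism can be sketched in a few lines. The following is a minimal illustration, not the authors' algorithm: random search over single-threshold stumps stands in for strongly typed genetic programming over multivariate tests, and a fitness term penalizes agreement with trees already in the forest, so that each induction run is pushed toward an alternative description of the data. All function names and the penalty weight `lam` are made up for this sketch.

```python
import random

def predict_stump(stump, x):
    """A 'tree' here is a single-split stump: (feature index, threshold)."""
    f, t = stump
    return int(x[f] > t)

def induce_stump(X, y, ensemble, lam=0.5, trials=200, rng=None):
    """Search for a stump whose fitness rewards accuracy on (X, y) but
    penalizes agreement with members already in `ensemble`.
    Random search stands in for the paper's GP-based induction."""
    rng = rng or random.Random(0)
    n_feat = len(X[0])
    best, best_fit = None, float("-inf")
    for _ in range(trials):
        f = rng.randrange(n_feat)
        t = rng.choice([x[f] for x in X])  # thresholds drawn from the data
        stump = (f, t)
        pred = [predict_stump(stump, x) for x in X]
        acc = sum(p == yi for p, yi in zip(pred, y)) / len(y)
        # Diversity term: mean per-sample agreement with existing members.
        if ensemble:
            agree = sum(
                sum(predict_stump(s, x) == p for x, p in zip(X, pred)) / len(y)
                for s in ensemble
            ) / len(ensemble)
        else:
            agree = 0.0
        fit = acc - lam * agree
        if fit > best_fit:
            best, best_fit = stump, fit
    return best

def forest_vote(forest, x):
    """Classify by unweighted majority vote of the forest members."""
    votes = [predict_stump(s, x) for s in forest]
    return int(sum(votes) * 2 >= len(votes))
```

Growing the forest is then a loop that repeatedly calls `induce_stump(X, y, forest, ...)` and appends the result; because later candidates are penalized for agreeing with earlier ones, each run tends to yield a different classifier, which is what gives a small ensemble its error reduction.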
© 2005 Springer-Verlag Berlin Heidelberg
Suchý, J., Kubalík, J. (2005). Inducing Diverse Decision Forests with Genetic Programming. In: Keijzer, M., Tettamanzi, A., Collet, P., van Hemert, J., Tomassini, M. (eds) Genetic Programming. EuroGP 2005. Lecture Notes in Computer Science, vol 3447. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-31989-4_27
Print ISBN: 978-3-540-25436-2
Online ISBN: 978-3-540-31989-4