An Ensemble of Optimal Trees for Classification and Regression (OTE)

Working Paper

  • Authors: Adler, A., Gul, A., Khan, Z., Lausen, B., Mahmoud, O., Miftahuddin, M., and Perperoglou, A.
Abstract:
The predictive performance of a random forest ensemble is closely tied to the strength of the individual trees and their diversity. An ensemble of a small number of accurate and diverse trees, provided prediction accuracy is not compromised, also reduces the computational burden. We investigate the idea of integrating trees that are both accurate and diverse. For this purpose, we utilize the out-of-bag observations from the training bootstrap samples as a validation set to choose the best trees based on their individual performance, and then assess these trees for diversity using the Brier score. Starting from the single best tree, a tree is selected for the final ensemble if its addition to the forest reduces the error of the trees already included. A total of 35 benchmark classification and regression problems are used to assess the performance of the proposed method and to compare it with k-nearest neighbours, a single tree, random forest, node harvest and support vector machines. We report unexplained variances and classification error rates for all methods on the corresponding data sets. Our experiments reveal that the size of the ensemble is reduced significantly and better results are obtained in most cases. For further verification, a simulation study is also presented in which four tree-style scenarios are considered to generate data sets with several structures.
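
The two-phase selection described in the abstract can be sketched as follows. This is an illustrative outline only, not the authors' implementation: it assumes a binary 0/1 response and scikit-learn decision trees, and the function and parameter names (select_optimal_trees, keep_frac, and the separate validation sample X_val, y_val used for the diversity check) are hypothetical.

# Sketch of the two-phase tree selection: rank trees by out-of-bag accuracy,
# then add a tree only if it lowers the ensemble's Brier score.
# Assumes X, y, X_val, y_val are NumPy arrays with y coded 0/1.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def _prob_pos(tree, X):
    # Probability of class 1; zero if the tree never saw class 1.
    proba = tree.predict_proba(X)
    classes = list(tree.classes_)
    return proba[:, classes.index(1)] if 1 in classes else np.zeros(len(X))

def select_optimal_trees(X, y, X_val, y_val, n_trees=500, keep_frac=0.2, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)

    # Phase 1: grow trees on bootstrap samples and rank them by their
    # accuracy on the corresponding out-of-bag (OOB) observations.
    scored = []
    for _ in range(n_trees):
        idx = rng.integers(0, n, n)              # bootstrap sample
        oob = np.setdiff1d(np.arange(n), idx)    # out-of-bag indices
        tree = DecisionTreeClassifier().fit(X[idx], y[idx])
        acc = tree.score(X[oob], y[oob]) if len(oob) else 0.0
        scored.append((acc, tree))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    best = [tree for _, tree in scored[: max(1, int(keep_frac * n_trees))]]

    # Phase 2: starting from the single best tree, keep a candidate only if
    # adding it reduces the Brier score of the ensemble on validation data.
    def brier(trees):
        p = np.mean([_prob_pos(t, X_val) for t in trees], axis=0)
        return np.mean((p - y_val) ** 2)

    ensemble = [best[0]]
    current = brier(ensemble)
    for tree in best[1:]:
        candidate = brier(ensemble + [tree])
        if candidate < current:
            ensemble.append(tree)
            current = candidate
    return ensemble

Prediction with the selected ensemble then simply averages the trees' class-1 probabilities (as in _prob_pos above) and thresholds at 0.5.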