

Identifiers
Title variants
Languages of publication
EN
Abstracts
EN
In most approaches to ensemble methods, the base classifiers are decision trees or decision stumps. In this paper, we consider an algorithm that generates an ensemble of decision rules, which are simple classifiers in the form of a logical expression: if [conditions], then [decision]. A single decision rule indicates only one of the decision classes. If an object satisfies the conditions of the rule, it is assigned to that class; otherwise the object remains unassigned. Decision rules were common in early machine learning approaches, and the most popular decision rule induction algorithms were based on a sequential covering procedure. The algorithm presented here follows a different approach to decision rule generation: it treats a single rule as a subsidiary, base classifier in the ensemble. First experimental results show that the presented algorithm is competitive with other methods. Additionally, the generated decision rules are easy to interpret, which is not the case for other types of base classifiers.
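The rule-and-abstain scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the attribute names, the example rules, and the majority-vote aggregation are assumptions made only for this sketch (the paper's method combines rules differently).

```python
# Sketch of an ensemble of decision rules (illustrative, not the paper's method).
# Each rule is a base classifier "if [conditions] then [decision]"; an object
# that fails the conditions is left unassigned (the rule abstains).

class DecisionRule:
    def __init__(self, conditions, decision):
        # conditions: dict mapping attribute name -> required value (assumed encoding)
        self.conditions = conditions
        self.decision = decision

    def classify(self, obj):
        # Return this rule's class if all conditions hold, else None (abstain).
        if all(obj.get(attr) == val for attr, val in self.conditions.items()):
            return self.decision
        return None


def ensemble_classify(rules, obj, default=None):
    # Simple majority vote over the rules that fire; if no rule fires,
    # fall back to a default class.
    votes = {}
    for rule in rules:
        decision = rule.classify(obj)
        if decision is not None:
            votes[decision] = votes.get(decision, 0) + 1
    if not votes:
        return default
    return max(votes, key=votes.get)


# Hypothetical rules over a toy weather dataset:
rules = [
    DecisionRule({"outlook": "sunny", "humidity": "high"}, "no"),
    DecisionRule({"outlook": "overcast"}, "yes"),
    DecisionRule({"wind": "weak"}, "yes"),
]
```

For `{"outlook": "overcast", "wind": "weak"}` two rules fire and both vote "yes"; for an object no rule covers, the ensemble returns the default, mirroring the "remains unassigned" behaviour of a single rule.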
Year
Pages
221--232
Physical description
Bibliogr. 22 items
Creators
author
  • Institute of Computing Science, Poznan University of Technology
Bibliography
  • [1] Michalski, R. S.: A Theory and Methodology of Inductive Learning. In Michalski, R. S., Carbonell, J. G., Mitchell, T. M. (eds.): Machine Learning: An Artificial Intelligence Approach. Palo Alto, Tioga Publishing (1983) 83-129
  • [2] Booker, L. B., Goldberg, D. E., Holland, J. H.: Classifier systems and genetic algorithms. In Carbonell, J. G. (ed.): Machine Learning. Paradigms and Methods. The MIT Press, Cambridge, MA (1990) 235-282
  • [3] Boros, E., Hammer, P. L., Ibaraki, T., Kogan, A., Mayoraz, E., Muchnik, I.: An Implementation of Logical Analysis of Data. IEEE Trans. on Knowledge and Data Engineering 12 (2000) 292-306
  • [4] Breiman, L.: Bagging Predictors. Machine Learning 24 2 (1996) 123-140
  • [5] Breiman, L.: Random Forests. Machine Learning 45 1 (2001) 5-32
  • [6] Breiman, L., Friedman, J. H., Olshen, R. A., Stone, C. J.: Classification and Regression Trees. Wadsworth (1984)
  • [7] Clark, P., Niblett, T.: The CN2 induction algorithm. Machine Learning 3 (1989) 261-283
  • [8] Cohen, W. W., Singer, Y.: A simple, fast, and effective rule learner. Proc. of the 16th National Conference on Artificial Intelligence (1999) 335-342
  • [9] Friedman, J. H., Hastie, T., Tibshirani, R.: Additive logistic regression: a statistical view of boosting. Dept. of Statistics, Stanford University Technical Report, http://www-stat.stanford.edu/~jhf/ (last access: 1.05.2006), August (1998)
  • [10] Friedman, J. H., Popescu, B. E.: Importance Sampled Learning Ensembles. Dept. of Statistics, Stanford University Technical Report, http://www-stat.stanford.edu/~jhf/ (last access: 1.05.2006), September (2003)
  • [11] Friedman, J. H., Hastie, T., Tibshirani, R.: Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer (2003)
  • [12] Friedman, J. H.: Recent advances in predictive (machine) learning. http://www-stat.stanford.edu/~jhf/ (last access: 1.05.2006), November (2003)
  • [13] Friedman, J. H., Popescu, B. E.: Gradient directed regularization. Stanford University Technical Report, http://www-stat.stanford.edu/~jhf/ (last access: 1.05.2006), February (2004)
  • [14] Friedman, J. H., Popescu, B. E.: Predictive Learning via Rule Ensembles. Dept. of Statistics, Stanford University Technical Report, http://www-stat.stanford.edu/~jhf/ (last access: 1.05.2006), February (2005)
  • [15] Grzymala-Busse, J. W.: LERS — A system for learning from examples based on rough sets. In Słowiński, R. (ed.): Intelligent Decision Support, Handbook of Applications and Advances of the Rough Sets Theory. Kluwer Academic Publishers (1992) 3-18
  • [16] Newman, D. J., Hettich, S., Blake, C. L., Merz, C. J.: UCI Repository of machine learning databases, http://www.ics.uci.edu/~mlearn/MLRepository.html (last access: 1.05.2006), Dept. of Information and Computer Sciences, University of California, Irvine (1998)
  • [17] Pawlak, Z.: Rough Sets. Theoretical Aspects of Reasoning about Data. Kluwer Academic Publishers, Dordrecht (1991)
  • [18] Quinlan, J. R.: C4.5: Programs for Machine Learning. Morgan Kaufmann (1993)
  • [19] Schapire, R. E., Freund, Y., Bartlett, P., Lee, W. S.: Boosting the margin: A new explanation for the effectiveness of voting methods. The Annals of Statistics 26 5 (1998) 1651-1686
  • [20] Skowron, A.: Extracting laws from decision tables - a rough set approach. Computational Intelligence 11 371-388
  • [21] Stefanowski, J.: On rough set based approach to induction of decision rules. In Skowron, A. and Polkowski, L. (eds.): Rough Sets in Knowledge Discovery, Physica Verlag, Heidelberg (1998) 500-529
  • [22] Witten, I. H., Frank, E.: Data Mining: Practical machine learning tools and techniques. 2nd Edition. Morgan Kaufmann, San Francisco (2005)
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-BPP1-0069-0089