Article title
Authors
Identifiers
Title variants
Publication languages
Abstracts
Multiple classifier fusion may yield more accurate classification than any of the constituent classifiers. The aim was to examine ensemble performance by comparing boosting, bagging, and fixed fusion methods for aiding diagnosis. A real-life medical data set for thyroid disease recognition was used. Different fixed combined classifiers (mean, average, product, minimum, maximum, and majority vote) built on parametric and nonparametric Bayesian discriminant methods were employed. No very significant improvement of recognition rates by fixed classifier combination was achieved on the examined data. The best performance was obtained for resampling methods with classification trees, for both the bagging and the boosting combining methods. Bagging and boosting of logistic regression proved less efficient than bagging or boosting of neural networks. No difference between bagging and boosting performance was observed for the examined data set.
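The fixed combination rules named in the abstract (mean, product, minimum, maximum, majority vote) can be sketched as follows. This is an illustrative implementation under my own assumptions about shapes and naming, not code from the paper: each base classifier is assumed to output a matrix of class-probability estimates, and the fusion rule reduces across classifiers before taking the arg-max class.

```python
import numpy as np

def fuse_fixed(probs, rule="mean"):
    """Combine per-classifier class-probability matrices with a fixed rule.

    probs: array of shape (n_classifiers, n_samples, n_classes)
    rule:  one of "mean", "product", "min", "max", "vote"
    Returns predicted class indices of shape (n_samples,).
    """
    probs = np.asarray(probs, dtype=float)
    if rule == "mean":
        scores = probs.mean(axis=0)        # average posterior per class
    elif rule == "product":
        scores = probs.prod(axis=0)        # product rule (independence assumption)
    elif rule == "min":
        scores = probs.min(axis=0)         # most pessimistic classifier per class
    elif rule == "max":
        scores = probs.max(axis=0)         # most confident classifier per class
    elif rule == "vote":
        # Each classifier votes for its arg-max class; count votes per class.
        votes = probs.argmax(axis=2)       # shape (n_classifiers, n_samples)
        n_classes = probs.shape[2]
        scores = np.stack(
            [(votes == c).sum(axis=0) for c in range(n_classes)], axis=1
        )
    else:
        raise ValueError(f"unknown rule: {rule}")
    return scores.argmax(axis=1)

# Toy example: 3 classifiers, 2 samples, 2 classes (values are made up).
probs = [[[0.6, 0.4], [0.2, 0.8]],
         [[0.7, 0.3], [0.4, 0.6]],
         [[0.4, 0.6], [0.1, 0.9]]]
print(fuse_fixed(probs, "vote"))  # -> [0 1]
```

Note that the product rule is sensitive to any single classifier assigning a near-zero probability, which is one reason the simple mean rule is often the more robust default.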
Publisher
Journal
Year
Volume
Pages
17-31
Physical description
Bibliography: 11 items, tables, charts
Contributors
author
- Department of Theoretical Foundations of Biomedical Sciences, Collegium Medicum, Nicolaus Copernicus University, ul. Jagiellońska 13, 85-067 Bydgoszcz, Poland, mjurkowska@cm.umk.pl
Bibliography
- [1] Webb A. R.: Statistical pattern recognition, John Wiley & Sons, Ltd, Chichester 2003. DOI: 10.1002/0470854774.ch1.
- [2] Duin R. P. W., Tax D. M. J.: Experiments with Classifier Combining Rules; in: Multiple Classifier Systems. Kittler J., Roli F. (Eds.), Berlin 2000, Springer-Verlag.
- [3] Yu K., Jiang X., Bunke H.: Lipreading: A classifier combination approach. Pattern Recognition Letters 1997, 18, 1421-1426.
- [4] Breiman L.: Bagging predictors. Machine Learning 1996, 24 (2), 123-140.
- [5] Skurichina M., Duin R.: Boosting in Linear Discriminant Analysis; in: Multiple Classifier Systems. Lecture Notes in Computer Science 2000, 190-199, DOI: 10.1007/3-540-45014-9_18.
- [6] Freund Y., Schapire R.: Experiments with a new boosting algorithm; in: Machine Learning: Proc. 13th Intern. Conf., Morgan Kaufmann, San Francisco 1996, 148-156.
- [7] Skurichina M., Kuncheva L., Duin R.: Bagging and boosting for the nearest mean classifier: Effects of sample size on diversity and accuracy; in: Multiple Classifier Systems. Lecture Notes in Computer Science 2002, 2364, 307-311. DOI: 10.1007/3-540-45428-4_6.
- [8] Kuncheva L. I., Whitaker Ch.: Using Diversity with Three Variants of Boosting: Aggressive, Conservative and Inverse. In: Multiple Classifier Systems. Lecture Notes in Computer Science, 2002, 2364, 717-720.
- [9] Shipp C. A., Kuncheva L. I.: Relationships between combination methods and measures of diversity in combining classifiers. Information Fusion 2002, 3 (2), 135-148.
- [10] Ćwiklińska-Jurkowska M., Jurkowski P.: Effectiveness in Ensemble of Classifiers and their Diversity on Big Medical Data Set. Computational Statistics. COMPSTAT 2004, 855-862.
- [11] Duda R., Hart P., Stork D.: Pattern classification. Wiley, New York 2001.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-BPZ6-0002-0019