Article title

A Novel Ensemble Model - The Random Granular Reflections

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Ensemble methods are among the most popular families of techniques for improving classification performance; Random Forests, Bagging and Boosting are the most widely used of them. This article presents a novel ensemble model, named Random Granular Reflections. The algorithm builds an ensemble of homogeneous granular decision systems. In the first step of the learning process, the training system is covered with random homogeneous granules (groups of objects from the same decision class whose required mutual indiscernibility is as low as possible). Next, a granular reflection of the training system is created, and this reflection is used in the classification process. Results of our initial experiments show that the approach is promising and comparable with the other tested methods. The main advantage of the new method is that there is no need to search for optimal parameters when looking for granular reflections in the subsequent iterations of the ensemble model.
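To make the procedure described in the abstract concrete, the sketch below gives one possible, simplified reading of the learning and voting loop in Python. It is illustrative only, not the authors' implementation: the attribute-agreement indiscernibility measure, the threshold-loosening rule used to grow random homogeneous granules, the majority value per attribute used as a granule representative, and the nearest-representative voting classifier are all assumptions introduced for this example.

# Illustrative sketch only: a simplified reading of the ensemble outlined in the
# abstract, not the authors' reference implementation. The similarity measure,
# granule-growing rule and voting scheme below are assumptions for this example.
import random
from collections import Counter

def indiscernibility(u, v):
    # Fraction of attributes on which two objects agree (assumed similarity measure).
    return sum(a == b for a, b in zip(u, v)) / len(u)

def random_homogeneous_granule(X, y, covered, rng):
    # Grow a granule around a random uncovered object: keep only objects of the same
    # decision class, admitted at the loosest indiscernibility threshold that still
    # keeps the granule homogeneous (simplified reading of homogeneous granulation).
    centre = rng.choice([i for i in range(len(X)) if i not in covered])
    thresholds = sorted({indiscernibility(X[centre], X[i]) for i in range(len(X))},
                        reverse=True)
    granule = [centre]
    for r in thresholds:
        candidate = [i for i in range(len(X))
                     if indiscernibility(X[centre], X[i]) >= r]
        if all(y[i] == y[centre] for i in candidate):
            granule = candidate          # still homogeneous, keep loosening
        else:
            break                        # mixing decision classes, stop here
    return granule

def granular_reflection(X, y, rng):
    # Cover the training system with random homogeneous granules and replace each
    # granule with a single representative (majority value on every attribute).
    covered, reflection = set(), []
    while len(covered) < len(X):
        granule = random_homogeneous_granule(X, y, covered, rng)
        covered.update(granule)
        rep = tuple(Counter(X[i][a] for i in granule).most_common(1)[0][0]
                    for a in range(len(X[0])))
        reflection.append((rep, y[granule[0]]))  # all members share one decision
    return reflection

def predict(ensemble, u):
    # Each granular reflection votes with its most indiscernible representative.
    votes = [max(refl, key=lambda item: indiscernibility(item[0], u))[1]
             for refl in ensemble]
    return Counter(votes).most_common(1)[0][0]

if __name__ == "__main__":
    # Toy symbolic decision system: two categorical attributes, one decision.
    X = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")]
    y = ["+", "+", "-", "-"]
    rng = random.Random(0)
    ensemble = [granular_reflection(X, y, rng) for _ in range(5)]
    print(predict(ensemble, ("a", "x")))  # expected: "+"

In this toy run, each of the five ensemble members draws its own random covering of the training system, so the reflections, and hence the votes, can differ between members even though no granulation parameter is tuned.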
Publisher
Year
Pages
183-203
Physical description
Bibliography: 32 items, figures, tables, charts
Authors
  • Faculty of Mathematics and Computer Science, University of Warmia and Mazury, Olsztyn, Poland
  • Faculty of Mathematics and Computer Science, University of Warmia and Mazury, Olsztyn, Poland
Bibliography
  • [1] Polkowski L. A model of granular computing with applications. Granules from rough inclusions in information systems. In: 2006 IEEE International Conference on Granular Computing GrC06, Atlanta, USA. IEEE Press, pp. 9-16. doi:10.1109/GRC.2006.1635745.
  • [2] Artiemjew P. Classifiers from Granulated Data Sets: Concept Dependent and Layered Granulation, 2007. doi:10.1.1.98.7145. URL http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.98.7145.
  • [3] Artiemjew P. A Review of the Knowledge Granulation Methods: Discrete vs. Continuous Algorithms, pp. 41-59. Springer Berlin Heidelberg, Berlin, Heidelberg. ISBN 978-3-642-30341-8, 2013. doi:10.1007/978-3-642-30341-8_4. URL https://doi.org/10.1007/978-3-642-30341-8_4.
  • [4] Polkowski L. Formal granular calculi based on rough inclusions. In: 2005 IEEE International Conference on Granular Computing GrC05, Beijing, China. IEEE Press, volume 1. 2005 pp. 57-69. doi:10.1109/GRC.2005.1547235.
  • [5] Polkowski L. Approximate Reasoning by Parts: An Introduction to Rough Mereology. Intelligent Systems Reference Library. Springer Berlin Heidelberg, 2011. ISBN 9783642222795. doi:10.1007/978-3-642-22279-5. URL https://books.google.pl/books?id=QokncKjRs_cC.
  • [6] Polkowski L, Artiemjew P. Granular Computing in Decision Approximation - An Application of Rough Mereology, volume 77 of Intelligent Systems Reference Library. Springer Berlin Heidelberg, 2015. ISBN 978-3-319-12879-5. doi:10.1007/978-3-319-12880-1.
  • [7] Ropiak K, Artiemjew P. On Granular Rough Computing: Epsilon Homogenous Granulation. In: Nguyen HS, Ha QT, Li T, Przybyła-Kasperek M (eds.), Rough Sets. Springer International Publishing, Cham. ISBN 978-3-319-99368-3, 2018 pp. 546-558. doi:10.1007/978-3-319-99368-3_43.
  • [8] Ropiak K, Artiemjew P. A Study in Granular Computing: Homogenous Granulation, pp. 336-346. 2018. doi:10.1007/978-3-319-99972-2_27. URL https://app.dimensions.ai/details/publication/pub.1106390407.
  • [9] Hu X. Ensembles of classifiers based on rough sets theory and set-oriented database operations. In: 2006 IEEE International Conference on Granular Computing. 2006 pp. 67-73. doi:10.1109/GRC.2006.1635760.
  • [10] Murthy CA, Saha S, Pal SK. Rough Set Based Ensemble Classifier. In: Kuznetsov SO, Slezak D, Hepting DH, Mirkin BG (eds.), Rough Sets, Fuzzy Sets, Data Mining and Granular Computing. Springer Berlin Heidelberg, Berlin, Heidelberg. ISBN 978-3-642-21881-1, 2011 p. 27. doi:10.1007/978-3-642-21881-1_5.
  • [11] Saha S, Murthy C, Pal S. Rough set Based Ensemble Classifier for Web Page Classification. Fundamenta Informaticae, 2007. 76(1-2):171-187.
  • [12] Shi L, Weng M, Ma X, Xi L. Rough Set Based Decision Tree Ensemble Algorithm for Text Classification, 2010. URL http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.588.4078&rep=rep1&type=pdf.
  • [13] Artiemjew P. Boosting Effect for Classifier Based on Simple Granules of Knowledge. Information Technology and Control, 2018. 47(2). doi:10.5755/j01.itc.47.2.19675. URL https://doi.org/10.5755/j01.itc.47.2.19675.
  • [14] Polkowski L. Rough Sets: Mathematical Foundations. Physica-Verlag, 2002. ISBN 3790815101. doi:10.1007/978-3-7908-1776-8.
  • [15] Polkowski L. A Rough Set Paradigm for Unifying Rough Set Theory and Fuzzy Set Theory. In: Wang G, Liu Q, Yao Y, Skowron A (eds.), Rough Sets, Fuzzy Sets, Data Mining, and Granular Computing. Springer Berlin Heidelberg, Berlin, Heidelberg. ISBN 978-3-540-39205-7, 2003 pp. 70-77. doi:10.1007/3-540-39205-X_9.
  • [16] Polkowski L. Toward Rough Set Foundations. Mereological Approach. In: Tsumoto S, Slowinski R, Komorowski J, Grzymala-Busse JW (eds.), Rough Sets and Current Trends in Computing. Springer Berlin Heidelberg, Berlin, Heidelberg. ISBN 978-3-540-25929-9, 2004 pp. 8-25. doi:10.1007/978-3-540-25929-9_2.
  • [17] Polkowski L. Granulation of Knowledge in Decision Systems: The Approach Based on Rough Inclusions. The Method and Its Applications. In: Kryszkiewicz M, Peters JF, Rybinski H, Skowron A (eds.), Rough Sets and Intelligent Systems Paradigms. Springer Berlin Heidelberg, Berlin, Heidelberg. ISBN 978-3-540-73451-2, 2007 pp. 69-79. doi:10.1007/978-3-540-73451-2_9.
  • [18] Polkowski L. The Paradigm of Granular Rough Computing: Foundations and Applications. In: 6th IEEE International Conference on Cognitive Informatics Lake Tahoe NV. IEEE Computer Society, Los Alamitos CA. 2007 pp. 154-162. doi:10.1109/COGINF.2007.4341886.
  • [19] Polkowski L. A Unified Approach to Granulation of Knowledge and Granular Computing Based on Rough Mereology: A Survey. In: Handbook of Granular Computing. John Wiley & Sons, 2008 pp. 375-401. doi:10.1002/9780470724163.ch16.
  • [20] Polkowski L. Granulation of Knowledge: Similarity Based Approach in Information and Decision Systems, pp. 1464-1487. Springer New York, New York, NY. ISBN 978-1-4614-1800-9, 2012. doi:10.1007/978-1-4614-1800-9_94. URL https://doi.org/10.1007/978-1-4614-1800-9_94.
  • [21] Quinlan JR. C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, 1993. ISBN 1558602380. doi:10.5555/152181.
  • [22] Zhou ZH. Ensemble Methods: Foundations and Algorithms. Chapman & Hall/CRC, 1st edition, 2012. ISBN 9781439830031. doi:10.5555/2381019.
  • [23] Ho TK. Random decision forests. In: Proceedings of 3rd International Conference on Document Analysis and Recognition, volume 1. 1995 pp. 278-282. doi:10.1109/ICDAR.1995.598994.
  • [24] Yang P. A Review of Ensemble Methods in Bioinformatics. Current Bioinformatics, 2010. 5(4):296-308. doi:10.2174/157489310794072508. URL http://www.eurekaselect.com/node/87255/article.
  • [25] Breiman L. Arcing classifier (with discussion and a rejoinder by the author). Annals of Statistics, 1998. 26(3):801-849. doi:10.1214/aos/1024691079. URL https://doi.org/10.1214/aos/1024691079.
  • [26] Freund Y, Schapire RE. A Short Introduction to Boosting. In: Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence. Morgan Kaufmann, 1999 pp. 1401-1406.
  • [27] Ohno-Machado L. Cross-validation and Bootstrap Ensembles, Bagging, Boosting. Harvard-MIT Division of Health Sciences and Technology, 2005. URL http://ocw.mit.edu/courses/health-sciences-and-technology/hst-951j-medical-decision-support-fall-2005/lecture-notes/hst951_6.pdf.
  • [28] Schapire RE. The Boosting Approach to Machine Learning: An Overview. In MSRI Workshop on Nonlinear Estimation and Classification, Berkeley, CA, USA, 2001.
  • [29] Zhou ZH. Boosting 25 years. CCL 2014 Keynote, 2014. URL https://www.slideshare.net/hustwj/ccl2014-keynote.
  • [30] Dua D, Graff C. UCI Machine Learning Repository, 2017. URL http://archive.ics.uci.edu/ml.
  • [31] He H, Ma Y. Imbalanced Learning: Foundations, Algorithms, and Applications. Wiley-IEEE Press, 1st edition, 2013. ISBN 1118074629. doi:10.5555/2559492.
  • [32] Fernández A, García S, Galar M, Prati RC, Krawczyk B, Herrera F. Learning from Imbalanced Data Sets. Springer, 2018. ISBN 978-3-319-98073-7.
Notes
The record was compiled with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Społeczna odpowiedzialność nauki" (Social Responsibility of Science), module: popularisation of science and promotion of sport (2021).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-3656d145-62a4-4269-9a6c-f7c030c44f57