Identifiers
Title variants
Publication languages
Abstracts
Meta-learning is becoming increasingly important in current and future research centered on broadly defined data mining and computational intelligence. It can solve problems that cannot be solved by any single, specialized algorithm. The overall behavior of each meta-learning algorithm depends mainly on two elements: the learning machine space and the supervisory procedure. The former restricts the space of all possible learning machines to a subspace to be browsed by the meta-learning algorithm. The latter determines the order in which selected learning machines are examined, with a module responsible for evaluating machine complexity; it organizes tests and analyzes their results. In this article we present a framework for meta-learning search that can be seen as a method of sophisticated description and evaluation of functional search spaces of learning machine configurations used in meta-learning. Machine spaces are defined by dedicated graphs whose vertices are specialized machine configuration generators. With such graphs, the learning machine space can be modeled much more flexibly, depending on the characteristics of the problem considered and on a priori knowledge. The presented method of search space description is combined with an advanced algorithm that orders test tasks according to their complexities.
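The two mechanisms the abstract names (a generator graph that spans the space of machine configurations, and a supervisory procedure that orders test tasks by estimated complexity) can be illustrated with a minimal sketch. This is not the authors' implementation; the generator names, configuration fields, and complexity measure below are placeholder assumptions chosen only to show the control flow:

```python
import heapq
import itertools

def classifier_generator():
    # Graph vertex: yields hypothetical base learning-machine
    # configurations (the model names are placeholders).
    for name in ["kNN", "SVM", "DecisionTree"]:
        yield {"model": name}

def transform_generator(upstream):
    # Downstream vertex: each upstream configuration is passed through
    # unchanged and also extended with an optional preprocessing step,
    # so the graph enlarges the searched machine space.
    for cfg in upstream:
        yield cfg
        yield {**cfg, "feature_selection": "ranking"}

def estimated_complexity(cfg):
    # Stand-in complexity measure: more pipeline steps -> costlier task.
    return len(cfg)

# Supervisory procedure: traverse the generator graph and push each
# test task into a priority queue keyed by estimated complexity,
# so cheaper configurations are tested first.
counter = itertools.count()  # tie-breaker keeps heapq comparisons valid
queue = []
for cfg in transform_generator(classifier_generator()):
    heapq.heappush(queue, (estimated_complexity(cfg), next(counter), cfg))

ordered = [heapq.heappop(queue)[2] for _ in range(len(queue))]
# Single-step configurations come out before two-step pipelines.
```

In the paper the generators and the complexity evaluator are far richer, but the ordering principle is the same: the queue guarantees that inexpensive test tasks are run before complex ones.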
Year
Volume
Pages
647-667
Physical description
Bibliography: 32 items; figures, tables, charts
Authors
author
- Department of Informatics, Nicolaus Copernicus University, ul. Grudziądzka 5, 87-100 Toruń, Poland, norbert@is.umk.pl
Bibliography
- [1] Bensusan, H., Giraud-Carrier, C. and Kennedy, C. J. (2000). A higher-order approach to meta-learning, in J. Cussens and A. Frisch (Eds.), Proceedings of the Work-in-Progress Track at the 10th International Conference on Inductive Logic Programming, Springer-Verlag, Berlin/Heidelberg, pp. 33-42.
- [2] Brazdil, P., Giraud-Carrier, C., Soares, C. and Vilalta, R. (2009). Metalearning: Applications to Data Mining, Springer, Berlin/Heidelberg.
- [3] Brazdil, P., Soares, C. and da Costa, J. P. (2003). Ranking learning algorithms: Using IBL and meta-learning on accuracy and time results, Machine Learning 50(3): 251-277.
- [4] Chan, P. and Stolfo, S. J. (1996). On the accuracy of metalearning for scalable data mining, Journal of Intelligent Information Systems 8(1): 5-28.
- [5] Czarnowski, I. and Jędrzejowicz, P. (2011). Application of agent-based simulated annealing and tabu search procedures to solving the data reduction problem, International Journal of Applied Mathematics and Computer Science 21(1): 57-68, DOI: 10.2478/v10006-011-0004-3.
- [6] Duch, W. and Grudziński, K. (1999). Search and global minimization in similarity-based methods, International Joint Conference on Neural Networks, Washington, DC, USA, p. 742.
- [7] Duch, W. and Itert, L. (2003). Committees of undemocratic competent models, Proceedings of the Joint International Conference on Artificial Neural Networks (ICANN) and the International Conference on Neural Information Processing (ICONIP), Istanbul, Turkey, pp. 33-36.
- [8] Duch, W., Wieczorek, T., Biesiada, J. and Blachnik, M. (2004). Comparison of feature ranking methods based on information entropy, Proceedings of International Joint Conference on Neural Networks, Budapest, Hungary, pp. 1415-1420.
- [9] Frank, A. and Asuncion, A. (2010). UCI machine learning repository, University of California, School of Information and Computer Science, Irvine, CA, http://archive.ics.uci.edu/ml.
- [10] Grąbczewski, K. and Jankowski, N. (2011). Saving time and memory in computational intelligence system with machine unification and task spooling, Knowledge-Based Systems 24(5): 570-588.
- [11] Guyon, I. (2003). NIPS 2003 workshop on feature extraction, http://www.clopinet.com/isabelle/Projects/NIPS2003.
- [12] Guyon, I. (2006). Performance prediction challenge, http://www.modelselect.inf.ethz.ch.
- [13] Guyon, I., Gunn, S., Nikravesh, M. and Zadeh, L. (Eds.) (2006). Feature Extraction: Foundations and Applications, Springer, Berlin/Heidelberg.
- [14] Jankowski, N., Duch, W. and Grąbczewski, K. (Eds.) (2011). Meta-learning in Computational Intelligence, Studies in Computational Intelligence, Vol. 358, Springer, Berlin/Heidelberg.
- [15] Jankowski, N. and Grąbczewski, K. (2005). Heterogenous committees with competence analysis, in N. Nedjah, L. Mourelle, M. Vellasco, A. Abraham and M. Köppen (Eds.), 5th International Conference on Hybrid Intelligent Systems, Rio de Janeiro, Brazil, IEEE Press, New York, NY, pp. 417-422.
- [16] Jankowski, N. and Grąbczewski, K. (2007). Handwritten digit recognition-Road to contest victory, IEEE Symposium Series on Computational Intelligence, IEEE Press, New York, NY, pp. 491-498.
- [17] Jankowski, N. and Grochowski, M. (2004). Comparison of instances selection algorithms I: Algorithms survey, in L. Rutkowski, I. Siekmann, R. Tadeusiewicz and L.A. Zadeh (Eds.), Artificial Intelligence and Soft Computing, Lecture Notes in Artificial Intelligence, Vol. 3070, Springer-Verlag, Berlin/Heidelberg, pp. 598-603.
- [18] Jankowski, N. and Grochowski, M. (2005). Instances selection algorithms in the conjunction with LVQ, in M.H. Hamza (Ed.), Artificial Intelligence and Applications, ACTA Press, Innsbruck, pp. 453-459.
- [19] Kadlec, P. and Gabrys, B. (2008). Learnt topology gating artificial neural networks, IEEE World Congress on Computational Intelligence, Hong Kong, China, pp. 2605-2612.
- [20] Kohonen, T. (1986). Learning vector quantization for pattern recognition, Technical Report TKK-F-A601, Helsinki University of Technology, Espoo.
- [21] Kordík, P. and Černý, J. (2011). Self-organization of supervised models, in N. Jankowski, W. Duch and K. Grąbczewski (Eds.), Meta-learning in Computational Intelligence, Studies in Computational Intelligence, Vol. 358, Springer, Berlin/Heidelberg, pp. 179-223.
- [22] Korytkowski, M., Nowicki, R., Rutkowski, L. and Scherer, R. (2011). AdaBoost ensemble of DCOG rough-neuro-fuzzy systems, in P. Jędrzejowicz, N. T. Nguyen and K. Hoang (Eds.), ICCCI (1), Lecture Notes in Computer Science, Vol. 6922, Springer, Berlin/Heidelberg, pp. 62-71.
- [23] Łęski, J. (2003). A fuzzy if-then rule-based nonlinear classifier, International Journal of Applied Mathematics and Computer Science 13(2): 215-223.
- [24] Peng, Y., Flach, P., Soares, C. and Brazdil, P. (2002). Improved dataset characterisation for meta-learning, 5th International Conference on Discovery Science, Luebeck, Germany, pp. 141-152.
- [25] Pfahringer, B., Bensusan, H. and Giraud-Carrier, C. (2000). Meta-learning by landmarking various learning algorithms, International Conference on Machine Learning, Stanford, CA, USA, pp. 743-750.
- [26] Prodromidis, A. and Chan, P. (2000). Meta-learning in distributed data mining systems: Issues and approaches, in H. Kargupta and P. Chan (Eds.), Book on Advances of Distributed Data Mining, AAAI Press, Menlo Park, CA.
- [27] Scherer, R. (2010). Designing boosting ensemble of relational fuzzy systems, International Journal of Neural Systems 20(5): 381-388.
- [28] Scherer, R. (2011). An ensemble of logical-type neuro-fuzzy systems, Expert Systems with Applications 38(10): 13115-13120.
- [29] Smith-Miles, K.A. (2008). Towards insightful algorithm selection for optimization using meta-learning concepts, IEEE World Congress on Computational Intelligence, Hong Kong, China, pp. 4117-4123.
- [30] Todorovski, L. and Dzeroski, S. (2003). Combining classifiers with meta decision trees, Machine Learning 50(3): 223-249.
- [31] Troć, M. and Unold, O. (2010). Self-adaptation of parameters in a learning classifier system ensemble machine, International Journal of Applied Mathematics and Computer Science 20(1): 157-174, DOI: 10.2478/v10006-010-0012-8.
- [32] Witten, I. H. and Frank, E. (2005). Data Mining: Practical Machine Learning Tools and Techniques, Morgan Kaufmann, Amsterdam.
Document type
YADDA identifier
bwmeta1.element.baztech-article-BPZ7-0007-0012