Article title

Covariance Matrix Self-Adaptation and Kernel Regression - Perspectives of Evolutionary Optimization in Kernel Machines

Authors
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Kernel-based techniques have shown outstanding success in data mining and machine learning in recent years. Many optimization problems in kernel-based methods suffer from multiple local optima, and evolution strategies have become successful methods for non-convex optimization. This work shows how both areas can profit from each other. We investigate the application of evolution strategies to Nadaraya-Watson-based kernel regression and vice versa. The Nadaraya-Watson estimator is used as a meta-model during optimization with the covariance matrix self-adaptation evolution strategy. An experimental analysis evaluates the meta-model-assisted optimization process on a set of test functions and investigates model sizes as well as the balance between objective function evaluations on the real function and on the surrogate. In turn, evolution strategies can be used to solve the embedded optimization problem of unsupervised kernel regression. The latter is fairly parameter-dependent, and minimization of the data space reconstruction error is an optimization problem with numerous local optima. We propose an evolution-strategy-based unsupervised kernel regression method to solve this embedded learning problem. Furthermore, we tune the novel method by means of sequential parameter optimization, a parameter tuning technique.
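The estimator at the core of both directions of the abstract is the Nadaraya-Watson kernel regression model, which predicts a query point as the kernel-weighted average of observed targets. A minimal sketch follows, assuming a Gaussian kernel and an illustrative bandwidth; the function and parameter names are our own and not taken from the paper.

```python
import numpy as np

def nadaraya_watson(x_query, X_train, y_train, h=0.5):
    """Nadaraya-Watson estimate: kernel-weighted average of the targets.

    X_train: (n, d) observed inputs; y_train: (n,) observed targets;
    h: kernel bandwidth (illustrative default, normally tuned).
    """
    # Gaussian kernel weights K_h(x - x_i) for every training point
    dists = np.linalg.norm(X_train - x_query, axis=1)
    weights = np.exp(-0.5 * (dists / h) ** 2)
    # Locally constant estimate; epsilon guards against all-zero weights
    return weights @ y_train / (weights.sum() + 1e-12)

# In surrogate-assisted optimization, X_train/y_train would hold previously
# evaluated (candidate, fitness) pairs, so offspring can be pre-screened on
# this cheap model and only the most promising ones evaluated on the true
# objective function.
```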
Keywords
Publisher
Year
Pages
87-106
Physical description
Bibliography: 48 items, tables, charts.
Contributors
author
  • Technische Universität Dortmund, Department of Computer Science, Algorithm Engineering / Computational Intelligence (LS XI), Otto-Hahn-Str. 14, 44221 Dortmund, Germany, oliver.kramer@tu-dortmund.de
Bibliografia
  • [1] Bartz-Beielstein, T., Lasarczyk, C., Preuss, M.: Sequential Parameter Optimization, Proceedings of the IEEE Congress on Evolutionary Computation - CEC 2005, IEEE Press, 2005.
  • [2] Bäck, T., Schütz, M.: Intelligent Mutation Rate Control in Canonical Genetic Algorithms, Foundations of Intelligent Systems, 9th International Symposium, ISMIS '96, Springer, 1996, 158-167.
  • [3] Beyer, H.-G., Melkozerov, A.: σ-Self-Adaptive Weighted Multirecombination Evolution Strategy with Scaled Weights on the Noisy Sphere, Proceedings of the 10th Conference on Parallel Problem Solving from Nature - PPSN X, 2008.
  • [4] Beyer, H.-G., Schwefel, H.-P.: Evolution strategies - A Comprehensive Introduction, Natural Computing, 1(1), 2002, 3-52.
  • [5] Beyer, H.-G., Sendhoff, B.: Covariance Matrix Adaptation Revisited - The CMSA Evolution Strategy -, Proceedings of the 10th Conference on Parallel Problem Solving from Nature - PPSN X, 2008.
  • [6] Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II, IEEE Transactions on Evolutionary Computation, 6(2), 2002, 182-197.
  • [7] Eiben, A. E., Hinterding, R., Michalewicz, Z.: Parameter Control in Evolutionary Algorithms, IEEE Transactions on Evolutionary Computation, 3(2), 1999, 124-141.
  • [8] Emmerich, M., Giotis, A., Özdemir, M., Bäck, T., Giannakoglou, K.: Metamodel-Assisted Evolution Strategies, Proceedings of the 7th Conference on Parallel Problem Solving from Nature - PPSN VII, 2002.
  • [9] Fogel, D. B.: Artificial Intelligence through Simulated Evolution, Wiley, New York, 1966.
  • [10] Forrester, A., Keane, A.: Recent advances in surrogate-based optimization, Progress in Aerospace Sciences, January 2009.
  • [11] Friedrichs, F., Igel, C.: Evolutionary tuning of multiple SVM parameters, Neurocomputing, 64, 2005, 107-117.
  • [12] Giannakoglou, K., Giotis, A., Karakasis, M. K.: Low-cost genetic optimization based on inexact pre-evaluations and the sensitivity analysis of design parameters, Inverse Problems in Engineering, 9, 2001, 389-412.
  • [13] Gieseke, F., Kramer, O.: Fast Evolutionary Maximum Margin Clustering, Proceedings of the International Conference on Machine Learning - ICML 2009, 2009.
  • [14] Hansen, N.: The CMA Evolution Strategy: A Tutorial, Technical report, TU Berlin, ETH Zürich, 2005.
  • [15] Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning, Springer, Berlin, 2009.
  • [16] Holland, J. H.: Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, 1975.
  • [17] Jolliffe, I.: Principal component analysis, Springer series in statistics, Springer, New York, 1986.
  • [18] Kern, S., Hansen, N., Koumoutsakos, P.: Local Meta-models for Optimization Using Evolution Strategies, Proceedings of the 9th Conference on Parallel Problem Solving from Nature - PPSN IX, 2006.
  • [19] Klanke, S., Ritter, H.: Variants of unsupervised kernel regression: General cost functions, Neurocomputing, 70(7-9), 2007, 1289-1303.
  • [20] Kramer, O.: Self-Adaptive Heuristics for Evolutionary Computation, Springer, Berlin, 2008.
  • [21] Lawrence, N. D., Hyvärinen, A.: Probabilistic non-linear principal component analysis with Gaussian process latent variable models, Journal of Machine Learning Research, 6, 2005, 1783-1816.
  • [22] Lawrence, N. D.: Gaussian process latent variable models for visualisation of high dimensional data, Proceedings of NIPS, 2004.
  • [23] Meinicke, P.: Unsupervised Learning in a Generalized Regression Framework, Ph.D. Thesis, University of Bielefeld, 2000.
  • [24] Meinicke, P., Klanke, S., Memisevic, R., Ritter, H.: Principal Surfaces from Unsupervised Kernel Regression, IEEE Trans. Pattern Anal. Mach. Intell., 27(9), 2005, 1379-1391.
  • [25] Mersch, B., Glasmachers, T., Meinicke, P., Igel, C.: Evolutionary Optimization of Sequence Kernels for Detection of Bacterial Gene Starts, ICANN (2), 2006.
  • [26] Meyer-Nieberg, S., Beyer, H.-G.: Self-Adaptation in Evolutionary Algorithms, in: Parameter Setting in Evolutionary Algorithms (F. G. Lobo, C. F. Lima, Z. Michalewicz, Eds.), Springer, Berlin, 2007.
  • [27] Mierswa, I.: Evolutionary learning with kernels: A generic solution for large margin problems, 2006.
  • [28] Mierswa, I., Morik, K.: About the non-convex optimization problem induced by non-positive semidefinite kernel learning, Advances in Data Analysis and Classification, 2(3), December 2008, 241-258.
  • [29] Morell, O., Bernholt, T., Fried, R., Kunert, J., Nunkesser, R.: An Evolutionary Algorithm for LTS Regression: A Comparative Study, 2008.
  • [30] Nadaraya, E. A.: On estimating regression, Theory of Probability and Its Applications, 10, 1964, 186-190.
  • [31] Ostermeier, A., Gawelczyk, A., Hansen, N.: A Derandomized Approach to Self Adaptation of Evolution Strategies, Evolutionary Computation, 2(4), 1994, 369-380.
  • [32] Rechenberg, I.: Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution, Frommann-Holzboog, Stuttgart, 1973.
  • [33] Riedmiller, M., Braun, H.: A Direct Adaptive Method for Faster Backpropagation Learning: The RPROP Algorithm, In Proceedings of the IEEE International Conference on Neural Networks, 1993.
  • [34] Rousseeuw, P. J., Leroy, A. M.: Robust regression and outlier detection, John Wiley & Sons, Inc., New York, NY, USA, 1987.
  • [35] Roweis, S. T., Saul, L. K.: Nonlinear dimensionality reduction by locally linear embedding, Science, 290, 2000, 2323-2326.
  • [36] Schölkopf, B., Smola, A. J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, Cambridge, MA, USA, 2001.
  • [37] Schwefel, H.-P.: Evolutionsstrategie und numerische Optimierung, Ph.D. Thesis, TU Berlin, 1975.
  • [38] Schwefel, H.-P.: Numerische Optimierung von Computer-Modellen mittels der Evolutionsstrategie, Birkhäuser, Basel, 1977.
  • [39] Schölkopf, B., Smola, A., Müller, K.-R.: Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation, 10(5), 1998, 1299-1319.
  • [40] Steffen, J., Pardowitz, M., Ritter, H.: A Manifold Representation as Common Basis for Action Production and Recognition, KI, 2009.
  • [41] Stoean, R., Dumitrescu, D., Preuss, M., Stoean, C.: Evolutionary Support Vector Regression Machines, SYNASC, 2006.
  • [42] Stoean, R., Preuss, M., Stoean, C., Dumitrescu, D.: Concerning the potential of evolutionary support vector machines, IEEE Congress on Evolutionary Computation CEC, 2007.
  • [43] Stoean, R., Preuss, M., Stoean, C., El-Darzi, E., Dumitrescu, D.: An Evolutionary Approximation for the Coefficients of Decision Functions within a Support Vector Machine Learning Strategy, in: Foundations of Computational Intelligence, Volume 1 (A. E. Hassanien, A. Abraham, Eds.), Springer, 2009, 315-347.
  • [44] Stoean, R., Preuss, M., Stoean, C., El-Darzi, E., Dumitrescu, D.: Support vector machine learning with an evolutionary engine, Journal of the Operational Research Society, 60(8), 2009, 1116-1122.
  • [45] Suykens, J. A. K., Vandewalle, J.: Least Squares Support Vector Machine Classifiers, Neural Processing Letters, 9(3), 1999, 293-300.
  • [46] Ulmer, H., Streichert, F., Zell, A.: Optimization by Gaussian Processes assisted Evolution Strategies, Springer Verlag, Heidelberg, Germany, 3-5 September 2003.
  • [47] Watson, G. S.: Smooth regression analysis, Sankhya, Series A, 26, 1964, 359-372.
  • [48] Zhang, K., Tsang, I. W., Kwok, J. T.: Maximum margin clustering made practical, Proceedings of the 24th International Conference on Machine Learning, ACM, New York, NY, USA, 2007.
Document type
YADDA identifier
bwmeta1.element.baztech-article-BUS8-0010-0006