Article title

Supervised Kernel Principal Component Analysis by Most Expressive Feature Reordering

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
This paper is concerned with feature space derivation through feature selection. The selection is performed on the results of kernel Principal Component Analysis (kPCA) of the input data samples. Several criteria that drive the feature selection process are introduced, and their performance is assessed and compared against the reference approach, which combines kPCA with most expressive feature reordering based on the Fisher linear discriminant criterion. It is shown that some of the proposed modifications generate feature spaces with noticeably better (by approximately 4%) class discrimination properties.
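As an illustration of the reference approach summarized above (kernel PCA followed by reordering of the resulting features by a Fisher-style discriminant score), the following minimal Python sketch is included. The RBF kernel, its gamma value, the toy two-moons data, the number of kPCA components, and the number of retained features are illustrative assumptions, not the paper's exact experimental setup.

import numpy as np
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA


def fisher_scores(Z, y):
    """Per-feature Fisher score: between-class scatter / within-class scatter."""
    classes = np.unique(y)
    overall_mean = Z.mean(axis=0)
    between = np.zeros(Z.shape[1])
    within = np.zeros(Z.shape[1])
    for c in classes:
        Zc = Z[y == c]
        mean_c = Zc.mean(axis=0)
        between += len(Zc) * (mean_c - overall_mean) ** 2
        within += ((Zc - mean_c) ** 2).sum(axis=0)
    return between / np.maximum(within, 1e-12)


# Toy two-class data (an assumption; the paper uses its own datasets).
X, y = make_moons(n_samples=200, noise=0.1, random_state=0)

# Unsupervised step: kPCA features, initially ordered by eigenvalue.
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=2.0)
Z = kpca.fit_transform(X)

# Supervised step: reorder the kPCA features by their Fisher score and
# keep the most discriminative ones (top 3 here, an arbitrary choice).
order = np.argsort(fisher_scores(Z, y))[::-1]
Z_selected = Z[:, order[:3]]
print("Feature order by Fisher score:", order)
print("Selected feature matrix shape:", Z_selected.shape)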
Year
Volume
Pages
3–10
Physical description
Bibliography: 17 items, figures, tables
Authors
author
  • Institute of Applied Computer Science, Lodz University of Technology, Lodz, Poland
author
  • Institute of Applied Computer Science, Lodz University of Technology, Lodz, Poland
author
  • Institute of Applied Computer Science, Lodz University of Technology, Lodz, Poland
author
  • Institute of Applied Computer Science, Lodz University of Technology, Lodz, Poland
Bibliography
  • [1] T. Hofmann, B. Scholkopf, and A. Smola, “Kernel methods in machine learning”, The Annals of Statistics, vol. 36, no. 3, pp. 1171–1220, 2008.
  • [2] B. Scholkopf and A. Smola, Learning with Kernels. Cambridge, MA: MIT Press, 2002.
  • [3] S. Roweis and L. Saul, “Nonlinear dimensionality reduction by locally linear embedding”, Science, vol. 290, pp. 2323–2326, 2000.
  • [4] M. Belkin and P. Niyogi, “Laplacian eigenmaps for dimensionality reduction and data representation”, Neural Comput., vol. 15, no. 6, pp. 1373–1396, 2003.
  • [5] J. Tenenbaum, V. de Silva, and J. Langford, “A global geometric framework for non-linear dimensionality reduction”, Science, vol. 290, pp. 2319–2323, 2000.
  • [6] B. Scholkopf, A. Smola, and K.-R. Muller, “Nonlinear component analysis as a kernel eigenvalue problem”, Neural Comput., vol. 10, pp. 1299–1319, 1998.
  • [7] S. Mika, G. Ratsch, J. Weston, B. Scholkopf, and K. Mullers, “Fisher discriminant analysis with kernels”, in Proc. IEEE Conf. of Neural Netw. for Sig. Process., Madison, WI, USA, 1999, pp. 41–48.
  • [8] E. Barshan, A. Ghodsi, Z. Azimifar, and M. Z. Jahromi, “Supervised principal component analysis: Visualization, classification and regression on subspaces and submanifolds”, Pattern Recogn., vol. 44, pp. 1357–1371, 2011.
  • [9] C. Burges, “A tutorial on support vector machines for pattern recognition”, Data Mining and Knowl. Discovery, vol. 2, pp. 121–167, 1998.
  • [10] M. Wang, S. Fei, and M. I. Jordan, “Unsupervised kernel dimension reduction”, in Proc. Conf. Adv. in Neural Inform. Process. Systems NIPS 2010, Vancouver, BC, Canada, 2010, vol. 23, pp. 2379–2387.
  • [11] Le Song, A. Smola, A. Gretton, J. Bedo, and K. Borgwardt, “Feature selection via dependence maximization”, J. Machine Learn. Res., vol. 13, pp. 1393–1434, 2012.
  • [12] G. Baudat and F. Anouar, “Feature vector selection and projection using kernels”, Neurocomputing, vol. 55, pp. 21–38, 2003.
  • [13] R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, 2nd ed. Wiley, 2000.
  • [14] R. A. Fisher, “The use of multiple measurements in taxonomic problems”, Ann. Eugenics, vol. 7, pp. 179–188, 1936.
  • [15] K. Etemad and R. Chellappa, “Discriminant analysis for recognition of human face images”, J. Optic. Society of America A, vol. 14, pp. 1724–1733, 1997.
  • [16] W. Skarbek, K. Kucharski, and M. Bober, “Dual LDA for face recognition”, Fundamenta Informaticae XXI, vol. 1, pp. 1–33, 2001.
  • [17] O. Chapelle, V. Vapnik, O. Bousquet, and S. Mukherjee, “Choosing multiple parameters for support vector machines”, Machine Learn., vol. 46, pp. 131–159, 2002.
Document type
YADDA identifier
bwmeta1.element.baztech-57323781-12f0-47ff-ad6b-fbef61124fd1