Article title

Convergence Analysis for Principal Component Flows

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
A common framework for analyzing the global convergence of several flows for principal component analysis is developed. It is shown that flows proposed by Brockett, Oja, Xu and others are all gradient flows and the global convergence of these flows to single equilibrium points is established. The signature of the Hessian at each critical point is determined.
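The flows analyzed in the article can be illustrated numerically. The following sketch (not taken from the paper; matrix sizes, spectrum, and step size are illustrative assumptions) integrates Oja's subspace flow dX/dt = CX − XXᵀCX with explicit Euler steps, for a symmetric positive definite matrix C with a known eigenvalue gap. Consistent with the global convergence results described in the abstract, X approaches an orthonormal basis of the dominant eigenspace of C:

```python
import numpy as np

# Build a symmetric positive definite C with a known spectrum, so the
# dominant p-dimensional eigenspace is span of the first p columns of Q.
rng = np.random.default_rng(0)
n, p = 6, 2
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
C = Q @ np.diag([10.0, 8.0, 3.0, 2.0, 1.0, 0.5]) @ Q.T

# Euler integration of Oja's subspace flow  dX/dt = C X - X X^T C X.
X = rng.standard_normal((n, p))
dt = 1e-3
for _ in range(100_000):
    CX = C @ X
    X += dt * (CX - X @ (X.T @ CX))

# At convergence the columns of X are orthonormal ...
orthonormal = np.allclose(X.T @ X, np.eye(p), atol=1e-6)

# ... and span the eigenspace of the p largest eigenvalues of C,
# i.e. projecting X onto that eigenspace leaves it unchanged.
P = Q[:, :p]
in_dominant_eigenspace = np.allclose(P @ (P.T @ X), X, atol=1e-6)
print(orthonormal, in_dominant_eigenspace)
```

The eigenvalue gap (8 versus 3) between the dominant and remaining eigenspaces governs the exponential convergence rate, which is why a fixed spectrum is used here rather than a fully random matrix.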
Authors
author
  • Department of Mathematics, Faculty of Science and Technology, Science University of Tokyo, Noda, Chiba 278, Japan
author
  • Department of Mathematics, University of Würzburg, D-97074 Würzburg, Germany
author
  • CITEDI-IPN, Av. del Parque 1310, Tijuana 22510, B.C., Mexico
Bibliography
  • [1] Baldi P. and Hornik K. (1991): Back-propagation and unsupervised learning in linear networks, In: Backpropagation: Theory, Architectures and Applications (Y. Chauvin and D.E. Rumelhart, Eds.). - Hillsdale, NJ: Erlbaum Associates.
  • [2] Baldi P. and Hornik K. (1995): Learning in linear neural networks: A survey. - IEEE Trans. Neural Netw., Vol. 6, No. 4, pp. 837-858.
  • [3] Brockett R.W. (1991): Dynamical systems that sort lists, diagonalize matrices and solve linear programming problems. - Lin. Algebra Appl., Vol. 146, pp. 79-91.
  • [4] Helmke U. and Moore J.B. (1994): Dynamical Systems and Optimization. - London: Springer.
  • [5] Łojasiewicz S. (1983): Sur les trajectoires du gradient d’une fonction analytique. - Seminari di Geometria, Bologna, Vol. 15, pp. 115-117.
  • [6] Oja E. (1982): A simplified neuron model as a principal component analyzer. - J. Math. Biol., Vol. 15, No. 3, pp. 267-273.
  • [7] Oja E. and Karhunen J. (1985): On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix. - J. Math. Anal. Appl., Vol. 106, No. 1, pp. 69-84.
  • [8] Oja E. (1989): Neural networks, principal components, and subspaces. - Int. J. Neural Syst., Vol. 1, pp. 61-68.
  • [9] Oja E., Ogawa H. and Wangviwattana J. (1992a): Principal component analysis by homogeneous neural networks, Part I: The weighted subspace criterion. - IEICE Trans. Inf. Syst., Vol. 3, pp. 366-375.
  • [10] Oja E., Ogawa H. and Wangviwattana J. (1992b): Principal component analysis by homogeneous neural networks, Part II: Analysis and extensions of the learning algorithms. - IEICE Trans. Inf. Syst., Vol. 3, pp. 376-382.
  • [11] Sanger T.D. (1989): Optimal unsupervised learning in a single-layer linear feedforward network. - Neural Netw., Vol. 2, No. 6, pp. 459-473.
  • [12] Williams R. (1985): Feature discovery through error-correcting learning. - Tech. Rep. No.8501, University of California, San Diego, Inst. of Cognitive Science.
  • [13] Wyatt J.L. and Elfadel I.M. (1995): Time-domain solutions of Oja’s equations. - Neural Comp., Vol. 7, No. 5, pp. 915-922.
  • [14] Xu L. (1993): Least mean square error recognition principle for self organizing neural nets. - Neural Netw., Vol. 6, No. 5, pp. 627-648.
  • [15] Yan W.Y., Helmke U. and Moore J.B. (1994): Global analysis of Oja’s flow for neural networks. - IEEE Trans. Neural Netw., Vol. 5, No. 5, pp. 674-683.
Notes
Research partly supported by the German-Israeli Foundation for Scientific Research and Development under grant GIF-I-526-034.06/97, and by the DFG project 436 RUS 113/275/0(R).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-BPZ1-0012-0010