Article title

Approximation properties of some two-layer feedforward neural networks

Authors
Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
In this article, we present multivariate two-layer feedforward neural networks that approximate continuous functions defined on [0, 1]^d. We show that the L1 error of approximation is asymptotically proportional to the modulus of continuity of the underlying function taken at √d/n, where n is the number of function values used.
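The rate stated in the abstract can be illustrated with a minimal sketch for d = 1. This is not the paper's construction: it is a generic one-hidden-layer ReLU network (the activation is an assumption here) that interpolates f at n + 1 equispaced knots on [0, 1], so its L1 error is controlled by the modulus of continuity ω(f; 1/n).

```python
import numpy as np

def two_layer_interpolant(f, n):
    """Build a two-layer (one hidden layer) ReLU network whose output
    is the piecewise-linear interpolant of f at n+1 equispaced knots."""
    knots = np.linspace(0.0, 1.0, n + 1)
    vals = f(knots)
    slopes = np.diff(vals) / np.diff(knots)    # slope on each subinterval
    weights = np.diff(slopes, prepend=0.0)     # c_0 = s_0, c_k = s_k - s_{k-1}

    def net(x):
        x = np.asarray(x, dtype=float)
        # Hidden layer: one ReLU unit per knot; output layer: linear combination.
        hidden = np.maximum(x[..., None] - knots[:-1], 0.0)
        return vals[0] + hidden @ weights

    return net

f = lambda x: np.sin(2 * np.pi * x)
net = two_layer_interpolant(f, n=64)

# Grid estimate of the L1 error; for Lipschitz f the modulus of continuity
# omega(f; 1/n) is of order 1/n, so the error shrinks as n grows.
xs = np.linspace(0.0, 1.0, 10001)
l1_error = np.mean(np.abs(net(xs) - f(xs)))
```

Doubling n roughly halves the bound ω(f; 1/n) for Lipschitz f, matching the √d/n scaling of the abstract with d = 1.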
Year
Pages
59-72
Physical description
Bibliography: 14 items.
Contributors
author
Bibliography
  • [1] G. Anastassiou, Quantitative Approximations, Chapman & Hall/CRC, Boca Raton, 2001.
  • [2] G. Anastassiou, Rate of convergence of some neural network operators to the unit-univariate case, Journal of Mathematical Analysis and Applications 212 (1997), 237-262.
  • [3] A.R. Barron, Universal approximation bounds for superpositions of a sigmoidal function, IEEE Transactions on Information Theory 39 (1993), 930-945.
  • [4] P. Cardaliaguet, G. Euvrard, Approximation of a function and its derivative with a neural network, Neural Networks 5 (1992), 207-220.
  • [5] G. Cybenko, Approximations by superpositions of a sigmoidal function, Mathematics of Control, Signals and Systems 2 (1989), 303-314.
  • [6] K.I. Funahashi, On the approximate realization of continuous mappings by neural networks, Neural Networks 2 (1989), 183-192.
  • [7] K. Hornik, M. Stinchcombe, H. White, Multilayer feedforward networks are universal approximators, Neural Networks 2 (1989), 359-366.
  • [8] K. Hornik, M. Stinchcombe, H. White, Universal approximation of an unknown mapping and its derivatives, Neural Networks 3 (1990), 551-560.
  • [9] M.A. Kon, L. Plaskota, Information complexity of neural networks, Neural Networks 13 (2000), 365-375.
  • [10] B. Lenze, How to make sigma-pi neural networks perform perfectly on regular training sets, Neural Networks 7 (1994), 1285-1293.
  • [11] B. Lenze, One-sided approximation and interpolation operators generating hyperbolic sigma-pi neural networks, In: Multivariate Approximation and Splines, G. Nürnberger, J.W. Schmidt, G. Walz (eds.), Birkhäuser Verlag, Basel 1997, 99-112.
  • [12] M. Leshno, V. Lin, A. Pinkus, S. Schocken, Multilayer feedforward networks with non-polynomial activation functions can approximate any continuous function, Neural Networks 6 (1993), 861-867.
  • [13] F. Scarselli, A.C. Tsoi, Universal approximation using feedforward neural networks: a survey of some existing methods, and some new results, Neural Networks 11 (1998), 15-37.
  • [14] X.M. Zhang, Y.Q. Chen, N. Ansari, Y.Q. Shi, Mini-max initialization for function approximation, Neurocomputing 57 (2004), 389-409.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-AGH4-0008-0006