Article title

Feedforward Neural Networks with Diffused Nonlinear Weight Functions

Authors
Identifiers
Title variants
PL
Jednokierunkowe sieci neuronowe z dyfundowanymi funkcjami wag
Publication languages
EN
Abstracts
EN
In this paper, feedforward neural networks are presented that have nonlinear weight functions based on look-up tables, which are specially smoothed by a regularization called diffusion. The idea behind this type of network is based on the hypothesis that a greater number of adaptive parameters per weight function might reduce the total number of weight functions needed to solve a given problem. Then, if the computational complexity of propagation through a single such weight function is kept low, the introduced neural networks might be relatively fast.
PL
The paper describes feedforward neural networks with weight functions represented by look-up tables that are specially smoothed in a regularization called diffusion. The idea of using this type of network follows from the hypothesis that more adaptive parameters per weight function allow the number of these functions to be reduced. Then, if only some of a weight's adaptive parameters were used during a single propagation, the propagation time through a single weight would also be relatively short, so such networks could be relatively fast.
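The abstract's central idea, a weight function stored as a look-up table whose entries are smoothed by a diffusion-style regularization, can be sketched roughly as below. This is a minimal illustration only: the class name LUTWeight, the table size, the linear interpolation, and the heat-equation smoothing step are assumptions made for the example, not the paper's exact formulation.

```python
import numpy as np

class LUTWeight:
    """Nonlinear weight function stored as a look-up table (illustrative sketch)."""

    def __init__(self, n_entries=64, x_min=-1.0, x_max=1.0, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        self.x_min, self.x_max = x_min, x_max
        # Adaptive parameters: one table entry per grid point of the input range.
        self.table = rng.normal(scale=0.1, size=n_entries)

    def __call__(self, x):
        # Propagation through the weight: locate the cell and linearly interpolate
        # between the two neighbouring entries, so only a constant number of
        # parameters is touched per input, keeping a single propagation cheap.
        n = len(self.table)
        t = (np.clip(x, self.x_min, self.x_max) - self.x_min) / (self.x_max - self.x_min)
        pos = t * (n - 1)
        i = min(int(np.floor(pos)), n - 2)
        frac = pos - i
        return (1.0 - frac) * self.table[i] + frac * self.table[i + 1]

    def diffuse(self, rate=0.1, steps=1):
        # "Diffusion" sketched as a discrete heat-equation step: each entry is
        # pulled toward the mean of its neighbours, smoothing the tabulated function.
        for _ in range(steps):
            padded = np.pad(self.table, 1, mode="edge")
            laplacian = padded[:-2] - 2.0 * self.table + padded[2:]
            self.table = self.table + rate * laplacian

# Hypothetical usage: evaluate the weight on an input, then smooth its table.
w = LUTWeight(n_entries=32)
y = w(0.37)                      # cheap lookup-and-interpolate propagation
w.diffuse(rate=0.2, steps=5)     # smooth the table between training updates
```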
Year
Pages
33--54
Physical description
Bibliography: 21 items, figures, charts.
Creators
author
  • Institute of Theoretical and Applied Computer Science of the Polish Academy of Sciences, Bałtycka 5, Gliwice, Poland
Bibliography
  • [1] M. R. Berthold and J. Diamond. Boosting the performance of rbf networks with dynamic decay adjustment. Advances in Neural Information Processing Systems, 7:521-528, 1995.
  • [2] C. M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, New York, 1995.
  • [3] C. de Boor. A Practical Guide to Splines. Springer Verlag, 1978.
  • [4] S. E. Fahlman and C. Lebiere. The cascade-correlation learning architecture. In D. S. Touretzky, editor, Advances in Neural Information Processing Systems, 2, pages 524-532, San Mateo, 1990. Morgan Kaufmann.
  • [5] T. N. E. Greville. Theory and Applications of Spline Functions. Academic Press, New York, 1969.
  • [6] S. Guarnieri and F. Piazza. Multilayer feed forward networks with adaptive spline activation function. IEEE Transactions on Neural Networks, 10(2):672-683, 1999.
  • [7] S. Guarnieri, F. Piazza, and A. Uncini. Multilayer feed forward networks with adaptive spline activation function. IEEE Transactions on Neural Networks, 10(3):672-683, 1999.
  • [8] J. A. Hertz, A. Krogh, and R. G. Palmer. Introduction to the Theory of Neural Computation. Addison-Wesley, Redwood City, CA, 1991.
  • [9] A. Krogh and J. A. Hertz. A simple weight decay can improve generalization. In John E. Moody, Steve J. Hanson, and Richard P. Lippmann, editors, Advances in Neural Information Processing Systems, volume 4, pages 950-957. Morgan Kaufmann Publishers, Inc., 1992.
  • [10] K. J. Lang and M. J. Witbrock. Learning to tell two spirals apart. In Proceedings of the 1988 Connectionist Models Summer School, pages 52-59. Morgan Kaufmann Publishers, Inc., 1988.
  • [11] W. S. McCulloch and W. H. Pitts. A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5:115-133, 1943.
  • [12] J. Moody and C. Darken. Fast learning in networks of locally-tuned processing units. Neural Computation, 1(2):281-294, 1989.
  • [13] C. Perwass, V. Banarer, and G. Sommer. Spherical decision surfaces using conformal modelling. In DAGM-Symposium, pages 9-16, 2003.
  • [14] F. Piazza, A. Uncini, and M. Zenobi. Neural networks with digital lut activation function. In Proceedings of International Joint Conference on Neural Networks, volume 2, pages 1401-1404, Nagoya, Japan, 1993.
  • [15] D. E. Rumelhart and J. L. McClelland. Explorations in the microstructure of cognition. Parallel Distributed Processing, 1:318-362, 1986.
  • [16] B. Schoelkopf, J. Platt, J. Shawe-Taylor, A. J. Smola, and R. C. Williamson. Estimating the support of a high-dimensional distribution. Neural Computation, 13:1443-1471, 2001.
  • [17] B. Schoelkopf, A. Smola, R. Williamson, and P. L. Bartlett. New support vector algorithms. Neural Computation, 12:1207-1245, 2000.
  • [18] M. Solazzi and A. Uncini. Artificial neural networks with adaptive multidimensional spline activation function. In Proceedings of the IEEE-INNS-ENNS International Conference on Neural Networks, 2000.
  • [19] A. Uncini, F. Capparelli, and F. Piazza. Fast complex adaptive spline neural networks for digital signal processing. In Proceedings of International Joint Conference on Neural Networks, pages 903-909, 1998.
  • [20] V. N. Vapnik. The Nature of Statistical Learning Theory. Springer, 1995.
  • [21] L. Vecci, F. Piazza, and A. Uncini. Learning and approximation capabilities of adaptive spline activation neural networks. Neural Networks, 11(2):259-270, 1998.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-BUJ3-0002-0042