Article title

Incrementally Solving Nonlinear Regression Tasks Using IBHM Algorithm

Languages of publication
EN
Abstracts
EN
This paper considers the black-box approximation problem, where the goal is to create a regression model from empirical data alone, without incorporating knowledge about the character of the approximated function's nonlinearity. It reports on ongoing work on a nonlinear regression methodology called IBHM, which builds a model as a weighted combination of nonlinear components. The construction process is iterative and based on correlation analysis. Owing to this iterative nature, the methodology requires no a priori assumptions about the final model structure, which greatly simplifies its use. Correlation-based learning becomes ineffective when the dynamics of the approximated function are too high; this paper therefore introduces weighted correlation coefficients into the learning process. These coefficients act as a kind of local filter and help overcome the problem. Proof-of-concept experiments show how the method solves approximation tasks, and the complexity of the method is briefly discussed.
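To make the construction loop concrete, the Python sketch below illustrates the general scheme the abstract describes. It is not the paper's IBHM implementation: the candidate pool, the residual-magnitude weighting used as the local filter, and the helper names weighted_corr and fit_incremental are all illustrative assumptions. It only shows the idea of growing a model greedily as a weighted sum of nonlinear components, each chosen by weighted correlation with the current residual.

    import numpy as np

    def weighted_corr(a, b, w):
        # Weighted Pearson correlation; the weights act as a local filter
        # emphasizing selected samples (an assumed form, not the paper's exact one).
        w = w / w.sum()
        ma, mb = np.sum(w * a), np.sum(w * b)
        cov = np.sum(w * (a - ma) * (b - mb))
        var_a = np.sum(w * (a - ma) ** 2)
        var_b = np.sum(w * (b - mb) ** 2)
        return cov / np.sqrt(var_a * var_b)

    def fit_incremental(x, y, candidates, n_components=5):
        # Greedy loop: at each step pick the candidate nonlinearity whose
        # output correlates most with the current residual, give it a
        # least-squares weight, and subtract its contribution.
        terms = []                         # list of (weight, function) pairs
        residual = y.astype(float)
        for _ in range(n_components):
            w = np.abs(residual) + 1e-12   # assumed local filter: focus on large residuals
            scores = [abs(weighted_corr(g(x), residual, w)) for g in candidates]
            best = candidates[int(np.argmax(scores))]
            gx = best(x)
            weight = gx.dot(residual) / gx.dot(gx)   # 1-D least squares
            terms.append((weight, best))
            residual = residual - weight * gx
        return lambda xq: sum(wgt * g(xq) for wgt, g in terms)

    # Usage: approximate a noisy sine from a small pool of candidate nonlinearities.
    x = np.linspace(-3, 3, 200)
    y = np.sin(2 * x) + 0.1 * np.random.randn(200)
    candidates = [np.sin, np.cos, np.tanh, lambda t: np.sin(2 * t), lambda t: t * t]
    model = fit_incremental(x, y, candidates)
    print(np.mean((model(x) - y) ** 2))   # residual mean-squared error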
Pages
65–72
Physical description
Bibliography: 13 items, figures, tables
Bibliography
  • [1] J. Arabas and A. Dydyński, “An algorithm of incremental construction of nonlinear parametric approximators”, in Evolutionary Computation and Global Optimization 2006, J. Arabas, Ed. Warsaw, Poland: WUT Press, 2006, pp. 31–38.
  • [2] J. Arabas and A. Dydyński, “Nonlinear time-series modeling and prediction using correlation analysis”, in Proc. Appl. Math. Mech. PAMM 2007, vol. 7, pp. 2030013–2030014, 2007.
  • [3] S. Haykin, Neural Networks: A Comprehensive Foundation. Prentice Hall, 1999.
  • [4] P. Zawistowski and J. Arabas, “Benchmarking IBHM method using NN3 competition dataset”, in Proc. Hybrid Artif. Intell. Syst. Conf. HAIS 2011, Wrocław, Poland, LNCS vol. 6678, Springer, pp. 263–270, 2011.
  • [5] H. Akaike, “A new look at the statistical model identification”, IEEE Trans. Autom. Contr., vol. 19, no. 6, pp. 716–723, 1974.
  • [6] B. Schölkopf and A. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, 2002.
  • [7] S. J. Farlow, “The GMDH algorithm of Ivakhnenko”, The American Statistician, vol. 35, no. 4, pp. 210–215, 1981.
  • [8] H. Deng, “Effective CLEAN algorithms for performance-enhanced detection of binary coding radar signals”, IEEE Trans. Signal Process., vol. 52, no. 1, pp. 72–78, 2004.
  • [9] T. L. Foreman, “Reinterpreting the CLEAN algorithm as an optimum detector”, in Proc. IEEE Radar Conf., Verona, NY, USA, 2006, pp. 24–27.
  • [10] A. Auger and N. Hansen, “A restart CMA evolution strategy with increasing population size”, in Proc. IEEE Congr. Evol. Comput., pp. 1769–1776, 2005.
  • [11] J. A. Nelder and R. Mead, “A simplex method for function minimization”, Comput. J., vol. 7, pp. 308–313, 1965.
  • [12] “Artificial Neural Network and Computational Intelligence Forecasting Competition” [Online]. Available: http://www.neural-forecasting-competition.com/NN3/index.htm
  • [13] S. Fahlman and C. Lebiere, “The cascade-correlation learning architecture”, Adv. Neural Inform. Process. Sys., no. 2, pp. 524–532, 1990.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-BATA-0015-0019