The problem of using atypical neural networks to create a symbolic description of the rules governing a set of empirical data is considered. We propose fractional–rational and polynomial functions as a versatile tool for describing the unknown rules underlying the empirical data. Our aim is to transform the basic forms of these functions into others suitable for neural implementation, i.e. by means of special–type perceptrons capable of determining the function coefficients through training. We discuss the issue of training such networks effectively. Important elements in improving the learning efficiency are: a) performing certain transformations of the fractional–rational or polynomial functions; b) introducing additional parameters into them; c) realizing complex–valued training. These steps make it possible to eliminate numerical operations on complex numbers from the learning procedure, despite the fact that some parameters of the functions are complex–valued and are varied during learning. Moreover, these steps eliminate from the training process time–consuming operations such as the use of activation functions of the ln(·) and exp(·) type. The proposed approach has proved to be a successful way of increasing the learning speed and improving its robustness. We show how the transformations and the additional parameters can be applied to modify one–dimensional fractional–rational as well as one–dimensional polynomial expressions. Perceptron schemes resulting from the obtained expressions are also presented. Furthermore, we discuss the properties of the applied learning method and demonstrate the learning effects.
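To illustrate the general idea behind the abstract, the sketch below fits the coefficients of a one–dimensional fractional–rational model to synthetic data by plain gradient descent on a least-squares error. This is only a minimal, assumed illustration of coefficient learning for a rational expression; the paper's specific perceptron structures, function transformations, extra parameters, and complex–valued training scheme are not reproduced here, and the model form f(x) = (a0 + a1·x)/(1 + b1·x) is chosen for illustration only.

```python
import numpy as np

# Hedged sketch (not the paper's method): learn the coefficients of a
# one-dimensional fractional-rational function
#     f(x) = (a0 + a1*x) / (1 + b1*x)
# from sampled data via gradient descent on the mean-squared error.

x = np.linspace(0.0, 2.0, 50)
y = (1.0 + 0.5 * x) / (1.0 + 0.3 * x)   # synthetic "empirical" data

a0, a1, b1 = 0.0, 0.0, 0.0              # trainable coefficients
lr = 0.05                                # learning rate
for _ in range(2000):
    denom = 1.0 + b1 * x
    f = (a0 + a1 * x) / denom
    err = f - y
    # gradients of the mean-squared error w.r.t. each coefficient:
    # df/da0 = 1/denom, df/da1 = x/denom, df/db1 = -f*x/denom
    g_a0 = np.mean(2.0 * err / denom)
    g_a1 = np.mean(2.0 * err * x / denom)
    g_b1 = np.mean(-2.0 * err * f * x / denom)
    a0 -= lr * g_a0
    a1 -= lr * g_a1
    b1 -= lr * g_b1

mse = float(np.mean(((a0 + a1 * x) / (1.0 + b1 * x) - y) ** 2))
print(a0, a1, b1, mse)
```

In this naive formulation the update rules require only real-valued arithmetic because all coefficients are real; the transformations discussed in the paper address the harder case where some function parameters are complex-valued, yet complex arithmetic and ln(·)/exp(·) activations are still kept out of the training loop.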