This paper proposes a framework for developing a broad variety of soft clustering and learning vector quantization (LVQ) algorithms based on gradient descent minimization of a reformulation function. According to the proposed axiomatic approach to learning vector quantization, the development of a specific algorithm reduces to the selection of a generator function. A linear generator function leads to the fuzzy c-means (FCM) and fuzzy LVQ (FLVQ) algorithms, while an exponential generator function leads to entropy-constrained fuzzy clustering (ECFC) and entropy-constrained LVQ (ECLVQ) algorithms. The reformulation of clustering and LVQ algorithms is also extended to supervised learning models through an axiomatic approach proposed for reformulating radial basis function (RBF) neural networks. This approach results in a broad variety of admissible RBF models, with the form of the radial basis functions determined by a generator function. This paper shows that gradient descent learning makes reformulated RBF neural networks an attractive alternative to conventional feedforward neural networks.
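As a point of reference for the linear-generator case mentioned above, the following is a minimal sketch of the standard fuzzy c-means algorithm. It uses the classical alternating updates of memberships and prototypes rather than the paper's gradient descent on a reformulation function; the function name, parameter names, and the fuzzifier default `m=2.0` are illustrative choices, not taken from the paper.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    """Standard fuzzy c-means via alternating optimization.

    X : (n, d) data matrix
    c : number of clusters
    m : fuzzifier, must be > 1
    Returns prototypes V (c, d) and memberships U (n, c).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial membership matrix; each row sums to 1.
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        W = U ** m
        # Prototype update: membership-weighted means of the data.
        V = (W.T @ X) / W.sum(axis=0)[:, None]
        # Squared Euclidean distances to each prototype.
        D = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)
        D = np.fmax(D, 1e-12)  # guard against division by zero
        # Membership update from the necessary optimality conditions.
        U = 1.0 / (D ** (1.0 / (m - 1.0)))
        U /= U.sum(axis=1, keepdims=True)
    return V, U
```

In the paper's terminology, these updates correspond to the special case produced by a linear generator function; other generator functions (e.g. exponential, yielding the entropy-constrained variants) change the form of the membership update.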