Results found: 4

Search results
Searched for:
keywords: multiple classifier system
EN
Multiple Classifier Systems (MCSs) very often improve classification accuracy compared with their base classifiers. Building an MCS consists of three phases: generation, selection and integration. The paper presents a two-stage dynamic ensemble selection based on the analysis of discriminant functions. The algorithm proposed in this work is applied to binary classification tasks. In the integration phase, the sum rule is used. Reported results based on the "Pima" data set show that the proposed two-stage ensemble selection is a promising method for the development of MCSs.
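The abstract does not give an implementation, but the fusion step it names, the sum rule over the base classifiers' discriminant functions, can be sketched as follows. The data set, pool size, and bootstrap-trained trees below are assumptions for illustration; the paper's two selection stages are omitted.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A binary task as a stand-in for the "Pima" data set used in the paper.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Pool of base classifiers: trees trained on bootstrap samples
# (an assumption; the paper does not fix the pool here).
rng = np.random.default_rng(0)
pool = []
for _ in range(10):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    pool.append(DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr[idx], y_tr[idx]))

# Sum rule: add the class-wise discriminant functions (posterior estimates)
# of the pooled classifiers, then take the arg-max over classes.
scores = sum(clf.predict_proba(X_te) for clf in pool)
y_pred = scores.argmax(axis=1)
print("sum-rule accuracy: %.3f" % (y_pred == y_te).mean())
```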
EN
The paper presents a dynamic ensemble selection based on the analysis of decision profiles. These profiles are obtained from the a posteriori probability functions returned by the base classifiers during the training process. The dynamic ensemble selection algorithms presented in the paper are dedicated to the binary classification task. In order to verify these algorithms, a number of experiments have been carried out on several medical data sets. The proposed dynamic ensemble selection is experimentally compared against an ensemble using the sum fusion method. As base classifiers, a pool of homogeneous classifiers was used. The obtained results are promising, as the classification accuracy of the ensemble classifier was improved.
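A decision profile collects, for one sample, the a posteriori probabilities returned by every base classifier. The sketch below builds such profiles and applies a simple per-sample confidence threshold before sum-rule fusion; the threshold criterion and the value of `tau` are illustrative assumptions, not the selection rule analysed in the paper. It assumes a pool of fitted probabilistic classifiers such as the one built in the previous sketch.

```python
import numpy as np

def decision_profile(pool, x):
    """Decision profile of sample x: one row of class posteriors per base classifier."""
    return np.vstack([clf.predict_proba(x.reshape(1, -1))[0] for clf in pool])

def des_predict(pool, X, tau=0.7):
    """Dynamic ensemble selection sketch for a binary task.

    For each sample, keep only the classifiers whose maximum a posteriori
    probability exceeds tau and fuse the survivors with the sum rule.
    """
    preds = []
    for x in X:
        dp = decision_profile(pool, x)            # shape: (n_classifiers, 2)
        selected = dp[dp.max(axis=1) >= tau]      # dynamic selection per sample
        if selected.size == 0:                    # fall back to the whole pool
            selected = dp
        preds.append(int(selected.sum(axis=0).argmax()))
    return np.array(preds)
```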
EN
The selection of classifiers is one of the important problems in the creation of an ensemble of classifiers. The paper presents a static selection in which a new method of calculating the weights of individual classifiers is used. The obtained weights can be interpreted in the context of interval logic: the particular weights are not given precisely; instead, their lower and upper values are used. A number of experiments have been carried out on several medical data sets.
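One plausible reading of interval-valued weights is sketched below: each base classifier's weight is summarised by the minimum and maximum of its cross-validated accuracies, and static selection keeps the classifiers whose lower bound clears a threshold. The cross-validation scheme, the (lower, upper) construction, and the threshold are assumptions for illustration; the paper's exact weighting formula is not reproduced here.

```python
from sklearn.model_selection import StratifiedKFold

def interval_weight(clf_factory, X, y, n_splits=5, seed=0):
    """Interval-valued weight for one base classifier: (lower, upper).

    Per-fold validation accuracies are summarised by their minimum and
    maximum, so the weight is an interval rather than a single number.
    """
    cv = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    scores = [clf_factory().fit(X[tr], y[tr]).score(X[va], y[va])
              for tr, va in cv.split(X, y)]
    return min(scores), max(scores)

def static_selection(weights, lower_threshold=0.5):
    """Keep the classifiers whose lower weight bound exceeds the threshold."""
    return [k for k, (lo, hi) in enumerate(weights) if lo > lower_threshold]
```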
EN
This paper presents a significant modification to the AdaSS (Adaptive Splitting and Selection) algorithm, which was developed several years ago. The method is based on the simultaneous partitioning of the feature space and an assignment of a compound classifier to each of the subsets. The original version of the algorithm uses a classifier committee and a majority voting rule to arrive at a decision. The proposed modification replaces the fairly simple fusion method with a combined classifier, which makes a decision based on a weighted combination of the discriminant functions of the individual classifiers selected for the committee. The weights mentioned above are dependent not only on the classifier identifier, but also on the class number. The proposed approach is based on the results of previous works, where it was proven that such a combined classifier method could achieve significantly better results than simple voting systems. The proposed modification was evaluated through computer experiments, carried out on diverse benchmark datasets. The results are very promising in that they show that, for most of the datasets, the proposed method outperforms similar techniques based on the clustering and selection approach.
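The fusion rule described here, a weighted combination of discriminant functions whose weights depend on both the classifier and the class, can be written compactly. The sketch below assumes a pool of fitted probabilistic classifiers and a given weight matrix; learning the weights jointly with the feature-space partition, as AdaSS does, is outside its scope.

```python
import numpy as np

def weighted_discriminant_fusion(pool, W, X):
    """Combined classifier with classifier- and class-dependent weights.

    W has shape (n_classifiers, n_classes); entry W[k, c] scales the
    discriminant function (here, the estimated posterior) of classifier k
    for class c before summation. In AdaSS these weights would be learned
    per feature-space subset; here they are simply taken as given.
    """
    scores = np.zeros((len(X), W.shape[1]))
    for k, clf in enumerate(pool):
        scores += clf.predict_proba(X) * W[k]   # broadcast the class-wise weights
    return scores.argmax(axis=1)
```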