Article title

Dimensionality Reduction for Probabilistic Neural Network in Medical Data Classification Problems

Publication languages
EN
Abstracts
EN
This article presents a study of dimensionality reduction in training data sets used for classification tasks performed by the probabilistic neural network (PNN). Two methods are proposed for this purpose. The first is based on the feature selection approach, in which a single decision tree and a random forest algorithm are adopted to select data features. The second applies a feature extraction procedure that utilizes the principal component analysis algorithm. Depending on the form of the smoothing parameter, different types of PNN models are explored. The prediction ability of PNNs trained on the original and the reduced data sets is determined by means of a 10-fold cross-validation procedure.
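
The abstract outlines an empirical pipeline: reduce the training data either by feature selection or by feature extraction, train a Gaussian-kernel PNN, and assess it with 10-fold cross-validation. The sketch below is not the authors' code; it illustrates the idea in Python with scikit-learn and NumPy on a stand-in medical data set (the Wisconsin breast cancer data, cf. [2]). It covers only the random-forest variant of the selection step, and the mean-importance threshold, the 95% variance retention level, and the single smoothing parameter sigma are illustrative assumptions; the paper itself also uses a single decision tree and explores several forms of the smoothing parameter.

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import StratifiedKFold
    from sklearn.preprocessing import StandardScaler

    def pnn_predict(X_train, y_train, X_test, sigma=0.5):
        # PNN: one Gaussian kernel per training pattern; a class's score is
        # the mean kernel activation over that class's patterns.
        classes = np.unique(y_train)
        scores = np.empty((len(X_test), len(classes)))
        for j, c in enumerate(classes):
            Xc = X_train[y_train == c]
            # Squared Euclidean distances from each test point to the patterns of class c.
            d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
            scores[:, j] = np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1)
        return classes[scores.argmax(axis=1)]

    def rf_select(X_tr, y_tr, X_te):
        # Feature selection: keep features whose random-forest importance
        # exceeds the mean importance (an assumed, illustrative threshold).
        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        keep = rf.feature_importances_ > rf.feature_importances_.mean()
        return X_tr[:, keep], X_te[:, keep]

    def pca_extract(X_tr, y_tr, X_te):
        # Feature extraction: project onto the principal components that
        # retain 95% of the variance (an assumed retention level).
        pca = PCA(n_components=0.95).fit(X_tr)
        return pca.transform(X_tr), pca.transform(X_te)

    def cv_accuracy(X, y, reduce_fn, n_splits=10):
        # 10-fold cross-validation; scaler and reducer are fitted on each
        # training fold only, so no information leaks from the test fold.
        accs = []
        for tr, te in StratifiedKFold(n_splits, shuffle=True, random_state=0).split(X, y):
            scaler = StandardScaler().fit(X[tr])
            X_tr, X_te = reduce_fn(scaler.transform(X[tr]), y[tr], scaler.transform(X[te]))
            accs.append(np.mean(pnn_predict(X_tr, y[tr], X_te) == y[te]))
        return float(np.mean(accs))

    X, y = load_breast_cancer(return_X_y=True)  # stand-in medical data set
    print("RF feature selection: ", cv_accuracy(X, y, rf_select))
    print("PCA feature extraction:", cv_accuracy(X, y, pca_extract))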
Pages
289–300
Physical description
Bibliography: 46 items; tables, figures, charts
Authors
author
  • Faculty of Electrical and Computer Engineering, Rzeszów University of Technology, Powstańców Warszawy 12, 35-959 Rzeszów, Poland
Bibliography
  • [1] M. A. Little, P. E. McSharry, E. J. Hunter, J. Spielman, and L. O. Ramig, “Suitability of Dysphonia Measurements for Telemonitoring of Parkinson’s Disease,” IEEE Transactions on Biomedical Engineering, vol. 56, no. 4, pp. 1015–1022, 2009.
  • [2] O. L. Mangasarian, W. N. Street, and W. H. Wolberg, “Breast cancer diagnosis and prognosis via linear programming,” Operations Research, vol. 43, no. 4, pp. 570–577, 1995.
  • [3] H. A. Guvenir, G. Demiroz, and N. Ilter, “Learning differential diagnosis of Erythemato-Squamous diseases using voting feature intervals,” Artificial Intelligence in Medicine, vol. 13, pp. 147–165, 1998.
  • [4] V. Bolon-Canedo, N. Sanchez-Marono, A. Alonso-Betanzos, J. M. Benitez, and F. Herrera, “A review of microarray datasets and applied feature selection methods,” Information Sciences, vol. 282, pp. 111–135, 2014.
  • [5] H. Almuallim and T. G. Dietterich, “Learning with many irrelevant features,” in Proceedings of the Ninth National Conference on Artificial Intelligence, 1991, pp. 547–552.
  • [6] J. R. Quinlan, “Induction of Decision Trees,” Machine Learning, vol. 1, no. 1, pp. 81–106, 1986.
  • [7] G. Pagallo and D. Haussler, “Boolean Feature Discovery In Empirical Learning,” Machine Learning, vol. 5, no. 1, pp. 71–100, 1990.
  • [8] L. Yu and H. Liu, “Feature Selection for High-Dimensional Data: A Fast Correlation-Based Filter Solution,” in Proceedings of the Twentieth International Conference on Machine Learning, 2003, Washington, USA.
  • [9] K. Kira and L. Rendell, “A practical approach to feature selection,” in Proceedings of the Ninth International Conference on Machine Learning, 1992, pp. 249–256.
  • [10] I. Kononenko, “Estimating attributes: Analysis and extensions of RELIEF,” Lecture Notes in Computer Science, vol. 784, pp. 171–182, 1994.
  • [11] M. Robnik-Sikonja and I. Kononenko, “Theoretical and Empirical Analysis of ReliefF and RReliefF,” Machine Learning Journal, vol. 53, pp. 23–69, 2003.
  • [12] C. Cardie, “Using Decision Trees to Improve Case-Based Learning,” in Proceedings of the Tenth International Conference on Machine Learning, 1993, pp. 25–32.
  • [13] M. Singh and G. M. Provan, “Efficient learning of selective Bayesian network classifiers,” in Proceedings of the Thirteenth International Conference on Machine Learning, Morgan Kaufmann, 1996.
  • [14] R. Kohavi and G. H. John, “Wrappers for Feature Subset Selection,” Artificial Intelligence, vol. 97, pp. 273–324, 1997.
  • [15] I. Guyon and A. Elisseeff, “An Introduction to Variable and Feature Selection,” Journal of Machine Learning Research, vol. 3, pp. 1157–1182, 2003.
  • [16] I. Guyon, J. Weston, S. Barnhill, and V. Vapnik, “Gene selection for cancer classification using support vector machines,” Machine Learning, vol. 46, pp. 389–422, 2002.
  • [17] Y. Saeys, I. Inza, and P. Larranaga, “A review of feature selection techniques in bioinformatics,” Bioinformatics, vol. 23, no. 19, pp. 2507–2517, 2007.
  • [18] A. M. Martinez and A. C. Kak, “PCA versus LDA,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 2, pp. 228–233, 2001.
  • [19] K. Delac, M. Grgic, and S. Grgic, “Independent comparative study of PCA, ICA, and LDA on the FERET data set,” International Journal of Imaging Systems and Technology, vol. 15, no. 5, pp. 252–260, 2005.
  • [20] M. Pechenizkiy, “The Impact of Feature Extraction on the Performance of a Classifier: kNN, Naïve Bayes and C4.5,” in B. Kégl and G. Lapalme (Eds.), AI 2005, Lecture Notes in Artificial Intelligence, vol. 3501, Springer-Verlag Berlin Heidelberg, pp. 268–279, 2005.
  • [21] S. Ghosh-Dastidar, H. Adeli, and N. Dadmehr, “Principal Component Analysis-Enhanced Cosine Radial Basis Function Neural Network for Robust Epilepsy and Seizure Detection,” IEEE Transactions on Biomedical Engineering, vol. 55, no. 2, pp. 512–518, 2008.
  • [22] D. F. Specht, “Probabilistic Neural Networks and the Polynomial Adaline as Complementary Techniques for Classification,” IEEE Transactions on Neural Networks, vol. 1, no. 1, pp. 111–121, 1990.
  • [23] C.-J. Huang and W.-C. Liao, “A Comparative Study of Feature Selection Methods for Probabilistic Neural Networks in Cancer Classification,” in 15th IEEE International Conference on Tools with Artificial Intelligence, 2003, pp. 451–458.
  • [24] T. R. Golub, D. K. Slonim, P. Tamayo, C. Huard, M. Gaasenbeek, J. P. Mesirov, H. Coller, M. L. Loh, J. R. Downing, M. A. Caligiuri, C. D. Bloomfield, and E. S. Lander, “Molecular Classification of Cancer: Class Discovery and Class Prediction by Gene Expression Monitoring,” Science, vol. 286, pp. 531–537, 1999.
  • [25] Y. Chtioui, S. Panigrahi, and R. Marsh, “Conjugate gradient and approximate Newton methods for an optimal probabilistic neural network for food color classification,” Optical Engineering, vol. 37, pp. 3015–3023, 1998.
  • [26] M. Kusy and R. Zajdel, “Application of reinforcement learning algorithms for the adaptive computation of the smoothing parameter for probabilistic neural network,” IEEE Transactions on Neural Networks and Learning Systems, vol. 26, no. 9, pp. 2163–2175, 2015.
  • [27] E. B. Hunt, J. Marin, and P. J. Stone, Experiments in induction. New York, USA: Academic Press, 1966.
  • [28] J. H. Friedman, “A recursive partitioning decision rule for nonparametric classification,” IEEE Transactions on Computers, pp. 404–408, 1977.
  • [29] L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and Regression Trees. USA: Chapman and Hall/CRC, 1984.
  • [30] J. R. Quinlan, C4.5: Programs for Machine Learning. San Mateo, USA: Morgan Kaufmann, 1993.
  • [31] J. R. Quinlan, “Improved Use of Continuous Attributes in C4.5,” Journal of Artificial Intelligence Research, vol. 4, pp. 77–90, 1996.
  • [32] L. Breiman, “Random Forests,” Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
  • [33] K. Pearson, “On Lines and Planes of Closest Fit to Systems of Points in Space,” Philosophical Magazine, vol. 2, no. 11, pp. 559–572, 1901.
  • [34] H. Hotelling, “Analysis of a complex of statistical variables into principal components,” Journal of Educational Psychology, vol. 24, pp. 417–441, 1933.
  • [35] P. H. Sherrod, “DTREG predictive modelling software,” 2015. Available: http://www.dtreg.com
  • [36] D. W. Aha, “Tolerating noisy, irrelevant and novel attributes in instance-based learning algorithms,” International Journal of Man-Machine Studies, vol. 36, pp. 267–287, 1992.
  • [37] S. B. Thrun et al., “The Monk’s problems: a performance comparison of different learning algorithms,” Technical report CMU-CS-91-197, Carnegie Mellon University, Pittsburgh, PA, 1991.
  • [38] I. Maglogiannis, E. Zafiropoulos, and I. Anagnostopoulos, “An intelligent system for automated breast cancer diagnosis and prognosis using SVM based classifiers,” Applied Intelligence, vol. 30, pp. 24–36, 2009.
  • [39] D. Mantzaris, G. Anastassopoulos, and A. Adamopoulos, “Genetic algorithm pruning of probabilistic neural networks in medical disease estimation,” Neural Networks, vol. 24, pp. 831–835, 2011.
  • [40] R. K. Orr, “Use of a Probabilistic Neural Network to Estimate the Risk of Mortality after Cardiac Surgery,” Medical Decision Making, vol. 17, pp. 178–185, 1997.
  • [41] E. Kyriacou, M. S. Pattichis, C. S. Pattichis et al., “Classification of atherosclerotic carotid plaques using morphological analysis on ultrasound images,” Applied Intelligence, vol. 30, pp. 3–23, 2009.
  • [42] S. Ramakrishnan and S. Selvan, “Image texture classification using wavelet based curve fitting and probabilistic neural network,” International Journal of Imaging Systems and Technology, vol. 17, pp. 266–275, 2007.
  • [43] H. Adeli and A. Panakkat, “A probabilistic neural network for earthquake magnitude prediction,” Neural Networks, vol. 22, pp. 1018–1024, 2009.
  • [44] L. Rutkowski, “Adaptive Probabilistic Neural Networks for Pattern Classification in Time-Varying Environment,” IEEE Transactions on Neural Networks, vol. 15, pp. 811–827, 2004.
  • [45] K. Bache and M. Lichman, “UCI Machine Learning Repository,” School of Information and Computer Science, University of California, Irvine, CA, USA, Technical Report, 2013. [Online]. Available: http://archive.ics.uci.edu/ml.
  • [46] M. Pechenizkiy, A. Tsymbal, and S. Puuronen, “PCA–based Feature Transformation for Classification: Issues in Medical Diagnostics,” in Proceedings of the 17th IEEE Symposium on Computer-Based Medical Systems, 2004, pp. 535–540.
YADDA identifier
bwmeta1.element.baztech-8e432898-8271-4f61-910a-1023d1d44baa