Article title

Feature Selection via Maximizing Fuzzy Dependency

Authors
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Feature selection is an important preprocessing step in pattern analysis and machine learning. The key issue in feature selection is evaluating the quality of candidate features. In this work, we introduce a weighted distance learning algorithm for feature selection via maximizing fuzzy dependency. We maximize the fuzzy dependency between the features and the decision by learning a weighted distance, and then evaluate the quality of the features with the learned weight vector. Features that receive large weights are considered useful for classification learning. We compare the proposed technique with some classical methods, and the experimental results show that the proposed algorithm is effective.
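The abstract describes the approach only at a high level. The following Python sketch illustrates the general idea under explicit assumptions, and is not the authors' published algorithm: samples are compared with a weighted Manhattan distance, the fuzzy similarity relation is taken as R_w(x, z) = max(0, 1 - d_w(x, z)), the fuzzy dependency is the mean lower-approximation membership of each sample to its own decision class, and the weights are constrained to sum to one and optimized with SciPy's SLSQP. The function names (fuzzy_dependency, learn_weights) and the toy data are purely illustrative.

```python
# Illustrative sketch only, not the authors' published algorithm.
# Assumptions: weighted Manhattan distance, similarity R_w(x, z) = max(0, 1 - d_w(x, z)),
# weights constrained to the probability simplex, SciPy's SLSQP as the optimizer.
import numpy as np
from scipy.optimize import minimize


def fuzzy_dependency(w, X, y):
    """Mean fuzzy lower-approximation membership of each sample to its own class."""
    # Pairwise weighted Manhattan distances, shape (n, n).
    d = np.abs(X[:, None, :] - X[None, :, :]) @ np.abs(w)
    sim = np.clip(1.0 - d, 0.0, 1.0)            # fuzzy similarity relation R_w
    diff_class = y[:, None] != y[None, :]       # pairs with different decision labels
    # Lower approximation of sample i: inf over differently labelled j of (1 - R_w(i, j)).
    lower = np.where(diff_class, 1.0 - sim, 1.0).min(axis=1)
    return lower.mean()


def learn_weights(X, y):
    """Maximize fuzzy dependency over a weight vector on the simplex."""
    m = X.shape[1]
    res = minimize(lambda w: -fuzzy_dependency(w, X, y),
                   x0=np.full(m, 1.0 / m),
                   method="SLSQP",
                   bounds=[(0.0, 1.0)] * m,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
    return res.x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=60)
    # Toy data: feature 0 separates the classes, feature 1 is pure noise.
    X = np.column_stack([y + 0.05 * rng.standard_normal(60), rng.random(60)])
    X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))   # scale to [0, 1]
    w = learn_weights(X, y)
    print("learned weights:", np.round(w, 3))   # the informative feature should dominate
```

Features would then be ranked by the learned weights, matching the abstract's criterion that features receiving large weights are the useful ones.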
Publisher
Year
Pages
167--181
Physical description
Bibliography: 31 items, charts
Contributors
author
author
author
author
author
Bibliography
  • [1] Asuncion, A., Newman, D.: UCI machine learning repository, 2007, URL http://www.ics.uci.edu/~mlearn/MLRepository.html.
  • [2] Blum, A., Langley, P.: Selection of relevant features and examples in machine learning, Artificial Intelligence, 97(1-2), 1997, 245-271.
  • [3] Dash, M., Liu, H.: Consistency-based search in feature selection, Artificial Intelligence, 151(1), 2003, 155-176.
  • [4] Dubois, D., Prade, H.: Rough fuzzy sets and fuzzy rough sets, International Journal of General Systems, 17(2), 1990, 191-209.
  • [5] Gilad-Bachrach, R., Navot, A., Tishby, N.: Margin based feature selection - theory and algorithms, Proceedings of the Twenty-First International Conference on Machine Learning, ACM, New York, NY, USA, 2004.
  • [6] Hu, Q., Xie, Z., Yu, D.: Hybrid attribute reduction based on a novel fuzzy-rough model and information granulation, Pattern Recognition, 40(12), 2007, 3509-3521.
  • [7] Hu, Q., Yu, D., Liu, J., Wu, C.: Neighborhood rough set based heterogeneous feature subset selection, Information Sciences, 178(18), 2008, 3577-3594.
  • [8] Hu, Q., Yu, D., Xie, Z.: Information-preserving hybrid data reduction based on fuzzy-rough techniques, Pattern Recognition Letters, 27(5), 2006, 414-423.
  • [9] Hu, Q., Yu, D., Xie, Z.: Neighborhood classifiers, Expert Systems with Applications, 34(2), 2008, 866-876.
  • [10] Hu, Q., Zhao, H., Xie, Z., Yu, D.: Consistency based attribute reduction, Lecture Notes in Computer Science, 4426, 2007, 96-107.
  • [11] Hu, X., Cercone, N.: Data mining via discretization, generalization and rough set feature selection, Knowledge and Information Systems, 1(1), 1999, 33-60.
  • [12] Jensen, R., Shen, Q.: Semantics-preserving dimensionality reduction: rough and fuzzy-rough-based approaches, IEEE Transactions on Knowledge and Data Engineering, 16(12), 2004, 1457-1471.
  • [13] Khan, J., Wei, J., Ringnér, M., Saal, L., Ladanyi, M., Westermann, F., Berthold, F., Schwab, M., Antonescu, C., Peterson, C., et al.: Classification and diagnostic prediction of cancers using gene expression profiling and artificial neural networks, Nature Medicine, 7(6), 2001, 673-679.
  • [14] Li, G., Meng, H., Lu, W., Yang, J., Yang, M.: Asymmetric bagging and feature selection for activities prediction of drug molecules, BMC Bioinformatics, 9(Suppl 6), 2008, S7.
  • [15] Liu, H., Yu, L.: Toward integrating feature selection algorithms for classification and clustering, IEEE Transactions on Knowledge and Data Engineering, 17(4), 2005, 491-502.
  • [16] Mi, J., Zhang, W.: An axiomatic characterization of a fuzzy generalization of rough sets, Information Sciences, 160(1-4), 2004, 235-249.
  • [17] Morsi, N., Yakout, M.: Axiomatics for fuzzy rough sets, Fuzzy Sets and Systems, 100(1-3), 1998, 327-342.
  • [18] Moser, B.: On representing and generating kernels by fuzzy equivalence relations, Journal of Machine Learning Research, 7, 2006, 2603-2620.
  • [19] Pawlak, Z.: Rough sets: Theoretical aspects of reasoning about data, Kluwer Academic Publishers, 1991.
  • [20] Peng, H., Long, F., Ding, C.: Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 1226-1238.
  • [21] Perou, C., Sørlie, T., Eisen, M., van de Rijn, M., Jeffrey, S., Rees, C., Pollack, J., Ross, D., Johnsen, H., Akslen, L., et al.: Molecular portraits of human breast tumours, Nature, 406, 2000, 747-752.
  • [22] Robnik-Šikonja, M., Kononenko, I.: Theoretical and empirical analysis of ReliefF and RReliefF, Machine Learning, 53(1), 2003, 23-69.
  • [23] Ślęzak, D.: Approximate entropy reducts, Fundamenta Informaticae, 53(3-4), 2002, 365-390.
  • [24] Ślęzak, D.: Degrees of conditional (in)dependence: A framework for approximate Bayesian networks and examples related to the rough set-based feature selection, Information Sciences, 179(3), 2009, 197-209.
  • [25] Swiniarski, R., Skowron, A.: Rough set methods in feature selection and recognition, Pattern Recognition Letters, 24(6), 2003, 833-849.
  • [26] Suraj, Z., El Gayar, N., Delimata, P.: A rough set approach to multiple classifier systems, Fundamenta Informaticae, 72(1-3), 2006, 393-406.
  • [27] Vapnik, V.: Statistical learning theory, Wiley, New York, 1998.
  • [28] Wang, G., Zhao, J., An, J., Wu, Y.: A comparative study of algebra viewpoint and information viewpoint in attribute reduction, Fundamenta Informaticae, 68(3), 2005, 289-301.
  • [29] Yao, Y., Zhao, Y.: Attribute reduction in decision-theoretic rough set models, Information Sciences, 178(17), 2008, 3356-3373.
  • [30] Yeung, D., Chen, D., Tsang, E., Lee, J., Wang, X. Z.: On the generalization of fuzzy rough sets, IEEE Transactions on Fuzzy Systems, 13(3), 2005, 343-361.
  • [31] Zhao, S., Tsang, E. C. C., Chen, D.: The model of fuzzy variable precision rough sets, IEEE Transactions on Fuzzy Systems, 17(2), 2009, 451-467.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-article-BUS8-0010-0010