Article title

Stacking-based multi-objective evolutionary ensemble framework for prediction of diabetes mellitus

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Diabetes mellitus (DM) is a group of metabolic disorders characterized by elevated blood glucose levels over a prolonged duration. Undiagnosed DM can give rise to a host of associated complications such as retinopathy, nephropathy, neuropathy and other vascular abnormalities. Against this background, machine learning (ML) approaches can play an essential role in the early detection, diagnosis and therapeutic monitoring of the disease. Recently, several research works have been proposed to predict the onset of DM. To this end, we develop a stacking-based evolutionary ensemble learning system, "NSGA-II-Stacking", for predicting the onset of Type 2 diabetes mellitus (T2DM) within five years. For this purpose, the publicly accessible Pima Indian Diabetes (PID) dataset is utilized. As a data pre-processing step, missing values and outliers are identified and imputed with median values. For base-learner selection, a multi-objective optimization algorithm is employed that simultaneously maximizes classification accuracy and minimizes ensemble complexity. For model combination, k-nearest neighbor (k-NN) is employed as the meta-classifier that combines the predictions of the base learners. The comparative results demonstrate that the proposed NSGA-II-Stacking method significantly outperforms several individual ML approaches and conventional ensemble approaches. In terms of performance metrics, the proposed system achieves an accuracy of 83.8%, sensitivity of 96.1%, specificity of 79.9%, F-measure of 88.5% and area under the ROC curve of 85.9%.
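For illustration only, the sketch below assembles a comparable stacking pipeline with scikit-learn: median imputation of the physiologically implausible zero entries in the PID features, a small pool of heterogeneous base learners, and a k-NN meta-classifier. The file name, column names and the particular base learners are assumptions; the NSGA-II base-learner subset selection described in the abstract is only noted in a comment, not implemented here.

```python
# Minimal sketch of a PID stacking pipeline, assuming scikit-learn and a local
# CSV copy of the Pima Indian Diabetes data with the usual column names.
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("pima_indians_diabetes.csv")        # hypothetical local file
X, y = df.drop(columns="Outcome"), df["Outcome"]

# In the PID data, impossible zeros (e.g. glucose = 0) stand in for missing
# values; mark them as NaN and impute with the median, as the abstract describes.
zero_as_missing = ["Glucose", "BloodPressure", "SkinThickness", "Insulin", "BMI"]
X[zero_as_missing] = X[zero_as_missing].replace(0, np.nan)
preprocess = make_pipeline(SimpleImputer(strategy="median"), StandardScaler())

# Candidate pool of heterogeneous base learners; in the paper a subset is chosen
# by NSGA-II (accuracy vs. ensemble size), here all candidates are simply kept.
base_learners = [
    ("svm", SVC(kernel="rbf", probability=True)),
    ("tree", DecisionTreeClassifier(max_depth=5)),
    ("nb", GaussianNB()),
    ("rf", RandomForestClassifier(n_estimators=100)),
]

# k-NN combines the base-learner predictions, mirroring the meta-classifier choice.
stack = make_pipeline(
    preprocess,
    StackingClassifier(estimators=base_learners,
                       final_estimator=KNeighborsClassifier(n_neighbors=5),
                       cv=5),
)

print(cross_val_score(stack, X, y, cv=10, scoring="accuracy").mean())
```

Wrapping the imputer and scaler in the same pipeline as the stacked ensemble keeps the median statistics inside each cross-validation fold, so the reported accuracy is not inflated by information leaking from the test split.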
Authors
  • Department of Computer Science and Engineering, National Institute of Technology, Raipur 492010, Chhattisgarh, India
  • Department of Computer Science and Engineering, National Institute of Technology, Raipur 492010, Chhattisgarh, India
Notes
The record was developed with funds from the Polish Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) programme - module: Popularization of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-604435d6-61ac-4a79-a57c-2c8620894bb0