Article title

Gabor, LBP, and BSIF features: Which is more appropriate for finger-knuckles-print recognition?

Title variants
PL
Funkcje Gabor, LBP i BSIF: Która z nich jest bardziej odpowiednia do rozpoznawania odcisków palców i kostek?
Publication languages
EN
Abstracts
EN
An accurate personal identification system helps control access to secure information and data. Biometric technology focuses mainly on the physiological or behavioural characteristics of the human body. This paper investigates a Finger Knuckle Print (FKP) biometric system based on feature extraction techniques. The FKP authentication method includes all the essential stages: preprocessing, feature extraction, and classification. The features relevant to FKP applications are investigated, and the paper proposes selecting the best feature extraction method on the basis of FKP recognition efficiency. The primary purpose of this paper is to apply Local Binary Patterns (LBP), Binarized Statistical Image Features (BSIF), and Gabor filters, and to determine which of them best improves the False Acceptance Rate (FAR) and Genuine Acceptance Rate (GAR). The proposed feature selection yields promising results in recognizing a person's finger-knuckle print.
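The pipeline summarised in the abstract (feature extraction, then matching scored by FAR/GAR) can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' implementation: all function names are hypothetical, only the basic 3×3 LBP variant is shown (BSIF and Gabor filtering are omitted), and FAR/GAR are computed with a simple score threshold.

```python
# Hypothetical sketch: basic 3x3 LBP descriptor plus threshold-based
# FAR/GAR evaluation, as discussed in the abstract. Not the paper's code.

def lbp_code(img, r, c):
    """8-bit LBP code: compare the 8 neighbours of pixel (r, c) to the centre."""
    center = img[r][c]
    # Clockwise neighbour offsets starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= center:  # neighbour >= centre sets the bit
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """256-bin histogram of LBP codes over all interior pixels (the feature vector)."""
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp_code(img, r, c)] += 1
    return hist

def far_gar(impostor_scores, genuine_scores, threshold):
    """FAR: fraction of impostor scores accepted; GAR: fraction of genuine scores accepted."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    gar = sum(s >= threshold for s in genuine_scores) / len(genuine_scores)
    return far, gar
```

In a full system, the LBP histograms of a probe and a gallery image would be compared (e.g. by histogram intersection) to produce the match scores fed into `far_gar`.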
Year
Pages
62–67
Physical description
35 references, figures, tables.
Contributors
  • Department of Electronics, LASS Laboratory, Faculty of Technology, University of M’sila, M’sila 28000, Algeria
author
  • Department of Electronics, LGE Laboratory, Faculty of Technology, University of M’sila, M’sila 28000, Algeria
  • Department of Electronics, LGE Laboratory, Faculty of Technology, University of M’sila, M’sila 28000, Algeria
  • LGE Laboratory, Faculty of Technology, University of M’sila, M’sila 28000, Algeria
Notes
Record created with funds from the Ministry of Science and Higher Education (MNiSW), agreement no. POPUL/SP/0154/2024/02, under the "Społeczna odpowiedzialność nauki II" ("Social Responsibility of Science II") programme, module: popularisation of science and promotion of sport (2025).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-467d6d0b-f125-4bf8-8f61-791b552e06ce