Article title
Content
Full texts:
Identifiers
Title variants
Publication languages
Abstracts
Face recognition (FR) is one of the most active research areas in the field of computer vision. Convolutional neural networks (CNNs) have been used extensively in this field owing to their effectiveness. It is therefore important to find the CNN hyperparameters that yield the best performance. Hyperparameter optimization is one of the techniques for improving the performance of CNN models. Since manual tuning of hyperparameters is a tedious and time-consuming task, population-based metaheuristic techniques can be used to automate the hyperparameter optimization of CNNs. Automatic tuning reduces manual effort and improves the efficiency of the CNN model. In the proposed work, genetic algorithm (GA) based hyperparameter optimization of CNNs is applied to face recognition. The GA optimizes hyperparameters such as the filter size, the number of filters, and the number of hidden layers. For the analysis, a benchmark FR dataset with ninety subjects is used. The experimental results indicate that the proposed GA-CNN model achieves improved accuracy in comparison with existing CNN models. In each iteration, the GA minimizes the objective function by selecting the best combination of CNN hyperparameters. An improved accuracy of 94.5% is obtained for FR.
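The abstract describes a GA that searches over the filter size, the number of filters, and the number of hidden (convolutional) layers, and in each iteration keeps the hyperparameter combinations that minimize the objective (validation error). The sketch below only illustrates such a loop and is not the authors' implementation: it assumes a TensorFlow/Keras backend, substitutes randomly generated placeholder data for the ninety-subject FR benchmark, and the helper names (`build_cnn`, `fitness`, `ga_search`), the search space, and the GA settings are hypothetical choices made for the example.

```python
# Minimal, illustrative GA-based CNN hyperparameter search (not the authors' code).
# Assumes TensorFlow/Keras; placeholder random data stands in for the FR benchmark.
import random
import numpy as np
import tensorflow as tf

# Hypothetical search space for the hyperparameters named in the abstract.
SPACE = {
    "filter_size": [3, 5, 7],
    "num_filters": [16, 32, 64],
    "num_conv_layers": [1, 2, 3],
}

def random_individual():
    # One candidate = one hyperparameter combination.
    return {k: random.choice(v) for k, v in SPACE.items()}

def build_cnn(ind, input_shape=(64, 64, 1), num_classes=90):
    # Build a CNN whose depth, filter count, and kernel size come from the candidate.
    model = tf.keras.Sequential([tf.keras.Input(shape=input_shape)])
    for _ in range(ind["num_conv_layers"]):
        model.add(tf.keras.layers.Conv2D(ind["num_filters"], ind["filter_size"],
                                         padding="same", activation="relu"))
        model.add(tf.keras.layers.MaxPooling2D())
    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dense(num_classes, activation="softmax"))
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def fitness(ind, x_tr, y_tr, x_va, y_va):
    # Fitness = validation accuracy after a short training run; maximizing it is
    # equivalent to minimizing the validation error used as the objective.
    model = build_cnn(ind)
    model.fit(x_tr, y_tr, epochs=1, batch_size=32, verbose=0)
    _, acc = model.evaluate(x_va, y_va, verbose=0)
    return acc

def crossover(a, b):
    # Uniform crossover: each gene (hyperparameter) comes from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    # With probability `rate`, resample a gene from its allowed values.
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def ga_search(x_tr, y_tr, x_va, y_va, pop_size=6, generations=3):
    # Tiny population/generation counts so the sketch runs quickly; a real run
    # would use larger values and cache fitness evaluations.
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=lambda i: fitness(i, x_tr, y_tr, x_va, y_va),
                        reverse=True)
        parents = ranked[: pop_size // 2]            # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=lambda i: fitness(i, x_tr, y_tr, x_va, y_va))

if __name__ == "__main__":
    # Placeholder data only; replace with the actual ninety-subject FR dataset.
    x = np.random.rand(200, 64, 64, 1).astype("float32")
    y = np.random.randint(0, 90, size=200)
    best = ga_search(x[:160], y[:160], x[160:], y[160:])
    print("Best hyperparameter combination found:", best)
```

In practice, the training budget per candidate, the population size, the number of generations, and the mutation rate dominate the cost and quality of the search and would be tuned to the dataset; the sketch keeps them small so it runs quickly.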
Year
Volume
Pages
21--31
Physical description
Bibliography: 35 items, figures, tables, charts
Authors
author
- Department of Computer Science and Information Technology, University of Jammu, Baba Saheb Ambedkar Road, 180006, Jammu, India
author
- Department of Computer Science and Information Technology, University of Jammu, Baba Saheb Ambedkar Road, 180006, Jammu, India
author
- Department of Computer Science and Information Technology, University of Jammu, Baba Saheb Ambedkar Road, 180006, Jammu, India
author
- Department of Electronics, University of Jammu, Baba Saheb Ambedkar Road, 180006, Jammu, India
Bibliography
- [1] Albadr, M.A., Tiun, S., Ayob, M. and Al-Dhief, F. (2020). Genetic algorithm based on natural selection theory for optimization problems, Symmetry 12(11): 1758.
- [2] Aszemi, N.M. and Dominic, D.D. (2019). Hyperparameter optimization in convolutional neural network using genetic algorithms, International Journal of Advanced Computer Science and Applications 10(6): 269-278.
- [3] Aydogdu, M.F., Celik, V. and Demirci, M.F. (2017). Comparison of three different CNN architectures for age classification, 11th IEEE International Conference on Semantic Computing (ICSC), San Diego, USA, pp. 372-377.
- [4] Bacanin, N., Zivkovic, M., Salb, M., Strumberger, I. and Chhabra, A. (2021). Convolutional neural networks hyperparameters optimization using sine cosine algorithm, in S. Shakya et al. (Eds), Sentimental Analysis and Deep Learning: Proceedings of ICSADL 2021, Springer, Singapore, pp. 863-878.
- [5] Bergstra, J. and Bengio, Y. (2012). Random search for hyper-parameter optimization, Journal of Machine Learning Research 13(2): 281-305.
- [6] Betró, B. (1991). Bayesian methods in global optimization, Journal of Global Optimization 1: 1-14.
- [7] Bibaeva, V. (2018). Using metaheuristics for hyper-parameter optimization of convolutional neural networks, IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP), Aalborg, Denmark, pp. 1-6.
- [8] Chhabra, Y., Varshney, S. and Ankita (2017). Hybrid particle swarm training for convolution neural network, International Conference on Contemporary Computing, Noida, India, pp. 1-3.
- [9] Cui, H. and Bai, J. (2019). A new hyperparameters optimization method for convolutional neural networks, Pattern Recognition Letters 125: 828-834.
- [10] Dargan, S., Kumar, M., Ayyagari, M.R. and Kumar, G. (2020). A survey of deep learning and its applications: A new paradigm to machine learning, Archives of Computational Methods in Engineering 27: 1071-1092.
- [11] Gadekallu, T.R., Alazab, M., Kaluri, R., Maddikunta, P.K.R., Bhattacharya, S., Lakshmanna, K. and Parimala, M. (2021). Hand gesture classification using a novel CNN-crow search algorithm, Complex & Intelligent Systems 7: 1855-1868.
- [12] Guo, G. and Zhang, N. (2019). A survey on deep learning based face recognition, Computer Vision and Image Understanding 189: 102805.
- [13] Hassanat, A., Almohammadi, K., Alkafaween, E., Abunawas, E., Hammouri, A. and Prasath, V.S. (2019). Choosing mutation and crossover ratios for genetic algorithms - A review with a new dynamic approach, Information 10(12): 390.
- [14] Ioffe, S. and Szegedy, C. (2015). Batch normalization: Accelerating deep network training by reducing internal covariate shift, International Conference on Machine Learning, Lille, France, pp. 448-456.
- [15] Karlupia, N., Sambyal, P., Abrol, P. and Lehana, P. (2019). BFO and GA based optimization of illumination switching patterns in large establishments, 6th International Conference on Computing for Sustainable Global Development (INDIACom), New Delhi, India, pp. 349-354.
- [16] Li, Y. and Abdallah, S. (2020). On hyperparameter optimization of machine learning algorithms: Theory and practice, Neurocomputing 415: 295-316.
- [17] Liu, J. and An, F.-P. (2020). Image classification algorithm based on deep learning-kernel function, Scientific Programming 3: 1-14.
- [18] Mahajan, P., Abrol, P. and Lehana, P.K. (2020a). Effect of blurring on identification of aerial images using convolution neural networks, in P.K. Singh et al. (Eds), Proceedings of ICRIC 2019: Recent Innovations in Computing, Springer, Cham, pp. 469-484.
- [19] Mahajan, P., Abrol, P. and Lehana, P.K. (2020b). Scene based classification of aerial images using convolution neural networks, Journal of Scientific and Industrial Research 79(12): 1087-1094.
- [20] Mahajan, P., Jakhetiya, V., Abrol, P., Lehana, P., Subudhi, B.N. and Guntuku, S.C. (2021). Perceptual quality evaluation of hazy natural images, IEEE Transactions on Industrial Informatics 17(12): 8046-8056.
- [21] Mahajan, P., Karlupia, N., Abrol, P. and Lehana, P.K. (2021). Identifying COVID-19 pneumonia using chest radiography using deep convolutional neural networks, 62nd International Scientific Conference on Information Technology and Management Science of Riga Technical University (ITMS), Riga, Latvia, pp. 1-6.
- [22] Mohakud, R. and Dash, R. (2021). Designing a grey wolf optimization based hyper-parameter optimized convolutional neural network classifier for skin cancer detection, Journal of King Saud University: Computer and Information Sciences 34(8).
- [23] Patro, K.K., Prakash, A.J., Samantray, S., Pławiak, J., Tadeusiewicz, R. and Pławiak, P. (2022). A hybrid approach of a deep learning technique for real-time ECG beat detection, International Journal of Applied Mathematics and Computer Science 32(3): 455-465, DOI: 10.34768/amcs-2022-0033.
- [24] Pujol, F.A., Mora, H. and Girona-Selva, J.A. (2016). A connectionist computational method for face recognition, International Journal of Applied Mathematics and Computer Science 26(2): 451-465, DOI: 10.1515/amcs-2016-0032.
- [25] Raju, K., Chinna Rao, B., Saikumar, K. and Lakshman Pratap, N. (2022). An optimal hybrid solution to local and global facial recognition through machine learning, in P. Kumar et al. (Eds), A Fusion of Artificial Intelligence and Internet of Things for Emerging Cyber Systems, Springer, Cham, pp. 203-226.
- [26] Rodrigues, L.F., Backes, A.R., Travençolo, B.A.N. and de Oliveira, G.M.B. (2022). Optimizing a deep residual neural network with genetic algorithm for acute lymphoblastic leukemia classification, Journal of Digital Imaging 35(3): 623-637.
- [27] Rojas, R. (1996). Neural Networks: A Systematic Introduction, Springer, Berlin/Heidelberg.
- [28] Sikha, O. and Bharath, B. (2022). VGG16-random Fourier hybrid model for masked face recognition, Soft Computing 26(22): 12795-12810.
- [29] Suganuma, M., Shirakawa, S. and Nagao, T. (2018). A genetic programming approach to designing convolutional neural network architectures, International Joint Conference on Artificial Intelligence, Stockholm, Sweden, pp. 5369-5373.
- [30] Syulistyo, A.R., Purnomo, D.M.J., Rachmadi, M.F. and Wibowo, A. (2016). Particle swarm optimization (PSO) for training optimization on convolutional neural network (CNN), Complex & Intelligent Systems 9(1): 52-58.
- [31] Victoria, H. and Maragatham, G. (2021). Automatic tuning of hyperparameters using Bayesian optimization, Evolving Systems 12: 217-223.
- [32] Vose, A., Balma, J., Heye, A., Rigazzi, A., Siegel, C., Moise, D., Robbins, B. and Sukumar, S.R. (2019). Recombination of artificial neural networks, arXiv: 1901.03900.
- [33] Wu, J., Chen, X.-Y., Zhang, H., Xiong, L.-D., Lei, H. and Deng, S.-H. (2019). Hyperparameter optimization for machine learning models based on Bayesian optimization, Journal of Electronic Science and Technology 17(1): 26-40.
- [34] Yoo, J.-H., Yoon, H.-i., Kim, H.-G., Yoon, H.-S. and Han, S.-S. (2019). Optimization of hyper-parameter for CNN model using genetic algorithm, 1st International Conference on Electrical, Control and Instrumentation Engineering (ICECIE), Kuala Lumpur, Malaysia, pp. 1-6.
- [35] Wang, Y., Zhang, H. and Zhang, G. (2019). CPSO-CNN: An efficient PSO-based algorithm for fine-tuning hyper-parameters of convolutional neural networks, Swarm and Evolutionary Computation 49: 114-123.
- [36] Zhou, M. (2021). Heuristic hyperparameter optimization for convolutional neural networks using genetic algorithm, arXiv: 2112.07087.
Notes
PL
Record developed with funds from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) programme - module: Popularization of science and promotion of sport (2022-2023)
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-7bdb2adb-5dc6-4037-9bb8-c7e5fcc3d6bf