Identifiers
Title variants
Publication languages
Abstracts
The main problem with the batch back propagation (BBP) algorithm is slow training, and several parameters, such as the learning rate, must be adjusted manually. In addition, the BBP algorithm suffers from training saturation. The objective of this study is to speed up training of the BBP algorithm and to remove training saturation. The training rate is the most significant parameter for increasing the efficiency of the BBP algorithm. In this study, a new dynamic training rate is created to speed up training of the BBP algorithm. The dynamic batch back propagation (DBBPLR) algorithm, which trains with a dynamic training rate, is presented. This technique was implemented with a sigmoid function. Several data sets were used as benchmarks for testing the effects of the created dynamic training rate. All experiments were performed in MATLAB. The experimental results show that the DBBPLR algorithm provides superior performance in terms of training: faster training with higher accuracy compared to the BBP algorithm and existing works.
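The abstract describes batch back propagation driven by a dynamic training rate, but the record does not include the DBBPLR update formula. The sketch below is a minimal illustration of the general idea only, assuming a common heuristic (grow the rate while the batch error falls, shrink it when the error rises); the network size, the 1.05/0.7 adjustment factors, the rate cap, and the XOR benchmark are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bbp_dynamic(X, y, hidden=4, epochs=5000, lr=0.5, seed=0):
    """Batch back propagation (one hidden layer, sigmoid activations)
    with a simple dynamic learning rate. The rate-update heuristic is
    an assumption for illustration, not the DBBPLR rule."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    prev_err = np.inf
    for _ in range(epochs):
        # forward pass over the whole batch
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        err = float(np.mean((y - out) ** 2))
        # dynamic rate: reward an error decrease, back off on an increase
        lr = min(lr * 1.05, 2.0) if err < prev_err else lr * 0.7
        prev_err = err
        # backward pass: batch gradients through the sigmoid derivatives
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out / len(X)
        W1 -= lr * X.T @ d_h / len(X)
    return W1, W2, err

# XOR as a tiny benchmark problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2, final_err = train_bbp_dynamic(X, y)
```

The self-regulating behavior is the point of interest: when the rate grows too large and the batch error rises, the multiplicative back-off shrinks it again, so no manual tuning of a fixed rate is needed.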
Year
Volume
Pages
82--89
Physical description
Bibliography: 30 items, figures, tables
Authors
author
- Department of Information Technology, Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin (UniSZA), 21300 Gong Badak, Kuala Nerus, Terengganu
author
- Department of Information Technology, Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin (UniSZA), 21300 Gong Badak, Kuala Nerus, Terengganu
Bibliography
- [1] H. Shao, J. Wang, L. Liu, L. D. Xu, and W. Bao, “Relaxed conditions for convergence of batch BPAP for feed forward neural networks”, Neurocomputing, vol. 153, pp. 174–179, 2015 (doi: 10.1016/j.neucom.2014.11.039).
- [2] H. H. Örkcü and H. Bal, “Comparing performances of back propagation and genetic algorithms in the data classification”, Expert Sys. with Applic., vol. 38, no. 4, pp. 3703–3709, 2011 (doi: 10.1016/j.eswa.2010.09.028).
- [3] P. Moallem and S. A. Ayoughi, “Improving back-propagation VIA an efficient combination of a saturation suppression method”, Neural Network World, vol. 2, pp. 207–223, 2010.
- [4] S. M. Shamsuddin and F. I. Saman, “Three Term backpropagation algorithm for classification backpropagation algorithm (BP)”, Neural Network World, vol. 4, no. 10, pp. 363–376, 2007 (doi: 10.1109/NABIC.2009.5393407).
- [5] J. Ge, J. Sha, and Y. Fang, “A new back propagation algorithm with chaotic learning rate”, in Proc. 2010 IEEE Int. Conf. on Softw. Engin. and Serv. Sci., Beijing, China, 2010, pp. 404–407 (doi: 10.1109/ICSESS.2010.5552353).
- [6] D. Xu, H. Shao, and H. Zhang, “A new adaptive momentum algorithm for split-complex recurrent neural networks”, Neurocomputing, vol. 93, pp. 133–136, 2012 (doi: 10.1016/j.neucom.2012.03.013).
- [7] G. Yang and F. J. Qian, “A fast and efficient two-phase sequential learning algorithm”, Appl. Soft Comp., vol. 25, no. C, pp. 129–138, 2014 (doi: 10.1016/j.asoc.2014.09.012).
- [8] S. Nandy, W. Bengal, and P. P. Sarkar, “An improved Gauss-Newton's method based back-propagation algorithm for fast convergence”, Int. J. of Comp. Appl. in Technol., vol. 39, no. 8, pp. 1–7, 2012.
- [9] M. S. Al Duais and F. S. Mohamad, “A review on enhancements to speed up training of the batch back propagation algorithm”, Ind. J. of Sci. and Technol., vol. 9, no. 46, pp. 1–10, 2016 (doi: 10.17485/ijst/2016/v9i46/91755).
- [10] E. Noersasongko, F. T. Julfia, A. Syukur, R. A. Pramunendar, and C. Supriyanto, “A tourism arrival forecasting using genetic algorithm based neural network”, Ind. J. of Sci. and Technol., vol. 9, no. 4, pp. 1–55, 2016 (doi: 10.17485/ijst/2016/v9i4/78722).
- [11] L. Wang, Y. Zeng, and T. Chen, “Back propagation neural network with adaptive differential evolution algorithm for time series forecasting”, Expert Sys. with Applic., vol. 42, no. 2, pp. 855–863, 2015 (doi: 10.1016/j.eswa.2014.08.018).
- [12] L. Rui, Y. Xiong, K. Xiao, and X. Qiu, “BP neural network-based web service selection algorithm in the smart distribution grid”, in Proc. 16th Asia-Pacific Netw. Oper. and Manag. Symp. APNOMS, Hsinchu, Taiwan, 2014 (doi: 10.1109/APNOMS.2014.6996111).
- [13] Q. Abbas, F. Ahmad, and M. Imran, “Variable learning rate based modification in backpropagation algorithm (MBPA) of artificial neural network for data classification”, Sci. Int., vol. 28, no. 3, pp. 2369–2378, 2016.
- [14] H. Zhang, W. Wu, and M. Yao, “Boundedness and convergence of batch back-propagation algorithm with penalty for feed-forward neural networks”, Neurocomputing, vol. 89, pp. 141–146, 2012 (doi: 10.1016/j.neucom.2012.02.029).
- [15] H. Shao, J. Wang, L. Liu, D. Xu, and W. Bao, “Relaxed conditions for convergence of batch BPAP for feed forward neural networks”, Neurocomputing, vol. 153, pp. 174–179, 2015 (doi: 10.1016/j.neucom.2014.11.039).
- [16] Q. Feng and G. Daqi, “Dynamic learning algorithm of multi-layer perceptrons for letter recognition”, in Proc. Int. Joint Conf. on Neural Networks IJCNN, Dallas, TX, USA, 2013, pp. 1–6.
- [17] UCI Machine Learning Repository [Online]. Available: https://archive.ics.uci.edu/ml/index.html (accessed Oct. 11, 2017).
- [18] N. A. Hamid, N. M. Nawi, R. Ghazali, M. Najib, and M. Salleh, “Improvements of back propagation algorithm performance by adaptively changing gain, momentum, and learning rate”, Int. J. of New Comp. Archit. and their Applic. IJNCAA, vol. 1, no. 2, pp. 889–90, 2011.
- [19] G. Yang and F. J. Qian, “A fast and efficient two-phase sequential learning algorithm”, Appl. Soft Comput., vol. 25, pp. 129–138, 2013 (doi: 10.1016/j.asoc.2014.09.012).
- [20] Y. Huang, “Advances in artificial neural networks – methodological development and application”, Algorithms, vol. 2, no. 3, pp. 973–1007, 2009 (doi: 10.3390/algor2030973).
- [21] A. E. Kostopoulos and T. N. Grapsa, “Self-scaled conjugate gradient training algorithms”, Neurocomputing, vol. 72, no. 13–15, pp. 3000–3019, 2009 (doi: 10.1016/j.neucom.2009.04.006).
- [22] Y. Li, Y. Fu, H. Li, and S.-W. Zhang, “The improved training algorithm of back propagation neural network with self-adaptive learning rate”, in Proc. Int. Conf. on Comput. Intell. and Natur. Comput., no. 3, Wuhan, Hubei, China, 2009, pp. 1–4 (doi: 10.1109/CINC.2009.111).
- [23] C.-C. Cheung, S.-C. Ng, A. K. Lui, and S. S. Xu, “Enhanced two-phase method in fast learning algorithms”, in Proc. Int. Joint Conf. on Neural Networks IJCNN, Barcelona, Spain, 2010, pp. 1–7 (doi: 10.1109/IJCNN.2010.5596519).
- [24] C. Yang and R. Xu, “Adaptation learning rate algorithm of feed-forward neural networks”, in Proc. Int. Conf. on Inf. Engin. and Comp. Sci., Wuhan, Hubei, China, 2009, pp. 1–3 (doi: 10.1109/ICIECS.2009.5366919).
- [25] N. M. Nawi, N. A. Hamid, R. S. Ransing, R. Ghazali, and M. N. M. Salleh, “Propagation neural network algorithm with adaptive gain on classification problems”, Int. J. of Datab. Theory and Applic., vol. 4, no. 2, pp. 65–75, 2011.
- [26] K. Wang, L. Zhuo, H. Lu, H. Guo, L. Xu, and Y. Zhang, “An improved BP algorithm over out-of-order streams for big data”, in Proc. Int. ICST Conf. on Commun. and Network., Guilin, Kuangsi, China, 2013, pp. 840–845 (doi: 10.1109/ChinaCom.2013.6694712).
- [27] Q. Dai, Z. Ma, and Q. Xie, “A two-phased and ensemble scheme integrated back propagation algorithm”, Appl. Soft Comput., no. 24, pp. 1124–1135, 2014 (doi: 10.1016/j.asoc.2014.08.012).
- [28] S. Scanzio, S. Cumani, R. Gemello, R. F. Mana, and F. Laface, “Parallel implementation of neural network training for speech recognition”, Patt. Recog. Lett., vol. 1, no. 11, pp. 1302–1309, 2010 (doi: 10.1016/j.patrec.2010.02.003).
- [29] M. N. Nasr and M. Chtourou, “A self-organizing map-based initialization for hybrid training of feed-forward neural networks”, Appl. Soft Comput., vol. 11, no. 8, pp. 4458–4464, 2011 (doi: 10.1016/j.asoc.2011.05.017).
- [30] F. Saki, A. Tahmaasbi, H. Soltanian-Zadeh, and S. B. Shokouhi, “Fast opposite weight learning rules with application in breast cancer diagnosis”, Comp. in Biol. and Med., vol. 43, no. 1, pp. 32–41, 2013 (doi: 10.1016/j.compbiomed.2012.10.006).
Notes
Record compiled under agreement No. 509/P-DUN/2018 with funds from the Polish Ministry of Science and Higher Education (MNiSW) allocated to activities promoting science (2018).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-c7de62b5-a0bc-410a-9154-ff577fba0603