Article title

Data augmentation techniques for transfer learning improvement in drill wear classification using convolutional neural network

Publication languages
EN
Abstracts
EN
This paper presents an improved method for recognizing the drill state on the basis of images of holes drilled in a laminated chipboard, using a convolutional neural network (CNN) and data augmentation techniques. Three classes were used to describe the drill state: red, for a drill that is worn out and should be replaced; yellow, for a state in which the system should send a warning to the operator, indicating that this element should be checked manually; and green, denoting a drill that is still in good condition and can be used further in the production process. The presented method combines the advantages of transfer learning and data augmentation to improve the accuracy of the resulting evaluations. In contrast to classical deep learning methods, transfer learning requires much smaller training data sets to achieve acceptable results. At the same time, data augmentation customized for drill wear recognition makes it possible to expand the original dataset and improve the overall accuracy. The experiments performed have confirmed the suitability of the presented approach for accurate class recognition in the given problem, even with a small original dataset.
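The augmentation step described in the abstract can be sketched as follows. This is a minimal illustration only: the specific transforms used here (horizontal/vertical flips and right-angle rotations) are assumptions for demonstration, not the exact transform set reported in the paper.

```python
import numpy as np

def augment(image):
    """Return simple augmented variants of a single hole image:
    the original, two mirror flips, and three right-angle rotations.
    Illustrative sketch; the paper's exact augmentation set may differ."""
    variants = [image]
    variants.append(np.fliplr(image))   # horizontal flip
    variants.append(np.flipud(image))   # vertical flip
    for k in (1, 2, 3):                 # 90-, 180- and 270-degree rotations
        variants.append(np.rot90(image, k))
    return variants

# Example: a tiny 4x4 grayscale "image" stands in for a drilled-hole photo
img = np.arange(16).reshape(4, 4)
aug = augment(img)
print(len(aug))  # 6 variants derived from one original image
```

Applying such transforms to every original photograph multiplies the effective size of a small dataset, which is the mechanism the abstract relies on to make transfer learning viable with limited data.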
Authors
  • Institute of Information Technology, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
  • Institute of Information Technology, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
  • Institute of Wood Sciences and Furniture, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
  • Institute of Wood Sciences and Furniture, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
  • Institute of Information Technology, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
author
  • Institute of Information Technology, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
  • Institute of Information Technology, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
author
  • Institute of Information Technology, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
  • Institute of Information Technology, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
  • Institute of Mechanical Engineering, Warsaw University of Life Sciences – SGGW, Warsaw, Poland
Bibliography
  • [1] K. Jemielniak, T. Urbański, J. Kossakowska, S. Bombiński. Tool condition monitoring based on numerous signal features. Int. J. Adv. Manuf. Technol., vol. 59, pp. 73-81, 2012.
  • [2] S. S. Panda, A. K. Singh, D. Chakraborty, S. K. Pal. Drill wear monitoring using back propagation neural network. Journal of Materials Processing Technology, vol. 172, pp. 283-290, 2006.
  • [3] R. J. Kuo. Multi-sensor integration for on-line tool wear estimation through artificial neural networks and fuzzy neural network. Engineering Applications of Artificial Intelligence, vol. 13, pp. 249-261, 2000.
  • [4] J. Kurek, M. Kruk, S. Osowski, P. Hoser, G. Wieczorek, A. Jegorowa, J. Górski, J. Wilkowski, K. Śmietańska, J. Kossakowska. Developing automatic recognition system of drill wear in standard laminated chipboard drilling process. Bulletin of the Polish Academy of Sciences. Technical Sciences, vol. 64, pp. 633-640, 2016.
  • [5] J. Kurek, G. Wieczorek, M. Kruk, A. Jegorowa, S. Osowski. Transfer learning in recognition of drill wear using convolutional neural network. 18th International Conference on Computational Problems of Electrical Engineering (CPEE) (pp. 1-4). IEEE. September 2017.
  • [6] J. Kurek, B. Swiderski, A. Jegorowa, M. Kruk, S. Osowski. Deep learning in assessment of drill condition on the basis of images of drilled holes. Proc. SPIE 10225 Eighth International Conference on Graphic and Image Processing (ICGIP 2016), pp. 102251V, February 8, 2017.
  • [7] L. Deng, D. Yu. Deep Learning: Methods and Applications. Foundations and Trends in Signal Processing, vol. 7, pp. 3-4, 2014.
  • [8] Y. Bengio. Learning Deep Architectures for AI. Foundations and Trends in Machine Learning, vol. 2, no. 1, pp. 1-127, 2009.
  • [9] I. Goodfellow, Y. Bengio, A. Courville. Deep learning, MIT Press, 2016.
  • [10] J. Schmidhuber. Deep Learning in Neural Networks: An Overview. Neural Networks, vol. 61, pp. 85-117, 2015.
  • [11] A. Krizhevsky, I. Sutskever, G. Hinton. ImageNet classification with deep convolutional neural networks. Advances in Neural Information Processing Systems, vol. 25, pp. 1-9, 2012.
  • [12] O. Russakovsky, J. Deng, H. Su et al. ImageNet Large Scale Visual Recognition Challenge. International Journal of Computer Vision (IJCV), vol. 115, no. 3, pp. 211-252, 2015.
  • [13] BVLC AlexNet Model. [online] Available: https://github.com/BVLC/caffe/tree/master/models/bvlc_alexnet.
  • [14] Matlab 2017a – User manual, Natick, MA, USA: The MathWorks, Inc, 2017.
  • [15] J. Kurek, I. Antoniuk, J. Górski, A. Jegorowa, B. Świderski, M. Kruk, G. Wieczorek, J. Pach, A. Orłowski, and J. Aleksiejuk-Gawron. Classifiers ensemble of transfer learning for improved drill wear classification using convolutional neural network. Machine Graphics & Vision, 28 (1/4): 13–13, 2019. http://mgv.wzim.sggw.pl/MGV28.html#1-13.
  • [16] ImageNet, [online] Available: http://www.image-net.org.
  • [17] B. Scholkopf, A. Smola. Learning with Kernels, Cambridge: MIT Press, 2002.
  • [18] M. Kruk, B. Świderski, S. Osowski, J. Kurek, M. Słowińska, I. Walecka. Melanoma recognition using extended set of descriptors and classifiers. Eurasip Journal on Image and Video Processing, vol. 43, pp. 1-10, 2015.
  • [19] V. N. Vapnik. Statistical Learning Theory, New York: Wiley, 1998.
  • [20] Description of Matlab image transformations. [online] Available: https://www.mathworks.com/help/deeplearning/examples/image-augmentation-using-image-processing-toolbox.html.
Notes
Record created with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) – module: Popularisation of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-b10d10cd-34c8-4ef5-8823-5417a38689f3