Article title

Analysis of the Impact of Data Augmentation on the Performance of Deep Learning Models in Multispectral Food Authenticity Identification

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Food authenticity is a significant concern in the meat industry, demanding effective detection methods. This study explores the use of multispectral imaging (MSI) and deep learning for meat adulteration detection. We evaluate different deep learning models using transfer learning and preprocessing techniques in a multi-level adulteration classification task. In addition, we propose a novel approach called one-band mixed augmentation for band selection in MSI data, which outperforms traditional reflectance-based feature selection and enhances model robustness. Furthermore, employing the nine-crop approach for dataset augmentation improved accuracy from 0.63 to 0.74 for the DenseNet201 model without transfer learning. This research contributes to advancing food safety assessment practices and provides insights into the application of deep learning for preventing food adulteration. The proposed one-band mixed augmentation approach offers a novel strategy for handling band selection challenges in MSI data analysis.
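The abstract does not spell out the exact layout of the nine-crop augmentation; the sketch below is a minimal, hypothetical reading of such a scheme, assuming a 3 x 3 grid of overlapping crops and an 18-band multispectral image. The function name, crop fraction, and band count are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of a nine-crop augmentation (layout and parameters assumed).
import numpy as np

def nine_crop(image: np.ndarray, crop_frac: float = 0.8) -> list:
    """Return nine overlapping crops of an H x W x C image, taken on a
    3 x 3 grid: four corners, four edge midpoints, and the centre."""
    h, w = image.shape[:2]
    ch, cw = int(h * crop_frac), int(w * crop_frac)
    # Top-left offsets of the crop windows along each axis.
    ys = [0, (h - ch) // 2, h - ch]
    xs = [0, (w - cw) // 2, w - cw]
    return [image[y:y + ch, x:x + cw] for y in ys for x in xs]

# Example: one synthetic multispectral image (18 spectral bands assumed).
img = np.random.rand(96, 96, 18)
crops = nine_crop(img)
print(len(crops), crops[0].shape)  # 9 crops, each 76 x 76 x 18

Expanding each training image into nine crops in this way multiplies the dataset size by nine, which is consistent with the reported accuracy gain for DenseNet201 trained without transfer learning.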
Year
Volume
Pages
823–832
Physical description
Bibliography: 24 items, illustrations, tables, charts.
Authors
author
  • Department of Advanced Computing Sciences, Faculty of Science and Engineering, Maastricht University, 6200 MD, The Netherlands
author
  • Department of Advanced Computing Sciences, Faculty of Science and Engineering, Maastricht University, 6200 MD, The Netherlands
author
  • Department of Advanced Computing Sciences, Faculty of Science and Engineering, Maastricht University, 6200 MD, The Netherlands
Bibliography
  • 1. C. Yang, G. Zhong, S. Zhou, Y. Guo, D. Pan, S. Wang, Q. Liu, Q. Xia, and Z. Cai, “Detection and characterization of meat adulteration in various types of meat products by using a high-efficiency multiplex polymerase chain reaction technique,” Frontiers in Nutrition, vol. 9, 2022. [Online]. Available: https://www.frontiersin.org/articles/10.3389/fnut.2022.979977
  • 2. K. Edwards, M. Manley, L. C. Hoffman, and P. J. Williams, “Nondestructive spectroscopic and imaging techniques for the detection of processed meat fraud,” Foods, vol. 10, no. 2, p. 448, 2021.
  • 3. P. F. Pereira, P. H. de Sousa Picciani, V. Calado, and R. V. Tonon, “Electrical gas sensors for meat freshness assessment and quality monitoring: A review,” Trends in Food Science & Technology, vol. 118, pp. 36–44, 2021.
  • 4. R. S. Andre, M. H. Facure, L. A. Mercante, and D. S. Correa, “Electronic nose based on hybrid free-standing nanofibrous mats for meat spoilage monitoring,” Sensors and Actuators B: Chemical, vol. 353, p. 131114, 2022.
  • 5. H. J. Marvin, E. M. Janssen, Y. Bouzembrak, P. J. Hendriksen, and M. Staats, “Big data in food safety: An overview,” Critical reviews in food science and nutrition, vol. 57, no. 11, pp. 2286–2295, 2017.
  • 6. M. Peyvasteh, A. Popov, A. Bykov, and I. Meglinski, “Meat freshness revealed by visible to near-infrared spectroscopy and principal component analysis,” Journal of Physics Communications, vol. 4, no. 9, p. 095011, 2020.
  • 7. P. Tsakanikas, L.-C. Fengou, E. Manthou, A. Lianou, E. Z. Panagou, and G.-J. E. Nychas, “A unified spectra analysis workflow for the assessment of microbial contamination of ready-to-eat green salads: Comparative study and application of non-invasive sensors,” Computers and electronics in agriculture, vol. 155, pp. 212–219, 2018.
  • 8. C. H. Q. Rego, F. França-Silva, F. G. Gomes-Junior, M. H. D. d. Moraes, A. D. d. Medeiros, and C. B. d. Silva, “Using multispectral imaging for detecting seed-borne fungi in cowpea,” Agriculture, vol. 10, no. 8, p. 361, 2020.
  • 9. S. Younas, Y. Mao, C. Liu, M. A. Murtaza, Z. Ali, L. Wei, W. Liu, and L. Zheng, “Measurement of water fractions in freeze-dried shiitake mushroom by means of multispectral imaging (MSI) and low-field nuclear magnetic resonance (LF-NMR),” Journal of Food Composition and Analysis, vol. 96, p. 103694, 2021.
  • 10. L.-C. Fengou, E. Spyrelli, A. Lianou, P. Tsakanikas, E. Z. Panagou, and G.-J. E. Nychas, “Estimation of minced pork microbiological spoilage through Fourier transform infrared and visible spectroscopy and multispectral vision technology,” Foods, vol. 8, no. 7, p. 238, 2019.
  • 11. M. Kamruzzaman, Y. Makino, and S. Oshita, “Rapid and non-destructive detection of chicken adulteration in minced beef using visible near-infrared hyperspectral imaging and machine learning,” Journal of Food Engineering, vol. 170, pp. 8–15, 2016.
  • 12. A. I. Ropodi, E. Z. Panagou, and G.-J. E. Nychas, “Multispectral imaging (MSI): A promising method for the detection of minced beef adulteration with horsemeat,” Food Control, vol. 73, pp. 57–63, 2017.
  • 13. A. I. Ropodi, E. Z. Panagou, and G.-J. E. Nychas, “Rapid detection of frozen-then-thawed minced beef using multispectral imaging and Fourier transform infrared spectroscopy,” Meat Science, vol. 135, pp. 142–147, 2018.
  • 14. L.-C. Fengou, A. Lianou, P. Tsakanikas, F. Mohareb, and G.-J. E. Nychas, “Detection of meat adulteration using spectroscopy-based sensors,” Foods, vol. 10, no. 4, p. 861, 2021.
  • 15. X. Li, X. Fan, L. Zhao, S. Huang, Y. He, and X. Suo, “Discrimination of pepper seed varieties by multispectral imaging combined with machine learning,” Applied Engineering in Agriculture, vol. 36, no. 5, pp. 743–749, 2020.
  • 16. M. A. Tamayo-Monsalve, E. Mercado-Ruiz, J. P. Villa-Pulgarin, M. A. Bravo-Ortíz, H. B. Arteaga-Arteaga, A. Mora-Rubio, J. A. Alzate-Grisales, D. Arias-Garzon, V. Romero-Cano, S. Orozco-Arias et al., “Coffee maturity classification using convolutional neural networks and transfer learning,” IEEE Access, vol. 10, pp. 42971–42982, 2022. doi: 10.1109/ACCESS.2022.3166515
  • 17. F. Chollet, “Keras,” https://github.com/fchollet/keras, 2015.
  • 18. K. Simonyan and A. Zisserman, “Very deep convolutional networks for large-scale image recognition,” arXiv preprint arXiv:1409.1556, 2014. [Online]. Available: https://arxiv.org/abs/1409.1556
  • 19. C. Szegedy, S. Ioffe, V. Vanhoucke, and A. Alemi, “Inception-v4, inception-resnet and the impact of residual connections on learning,” in Proceedings of the AAAI conference on artificial intelligence, vol. 31, no. 1, 2017.
  • 20. C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke, and A. Rabinovich, “Going deeper with convolutions,” in Proceedings of the IEEE conference on computer vision and pattern recognition, 2015, pp. 1–9.
  • 21. G. Huang, Z. Liu, L. Van Der Maaten, and K. Q. Weinberger, “Densely connected convolutional networks,” in Proceedings of the IEEE conference on computer vision and pattern recognition, 2017, pp. 4700–4708.
  • 22. O. Russakovsky, J. Deng, H. Su, J. Krause, S. Satheesh, S. Ma, Z. Huang, A. Karpathy, A. Khosla, M. Bernstein, A. C. Berg, and L. Fei-Fei, “Imagenet large scale visual recognition challenge,” 2014. [Online]. Available: https://arxiv.org/abs/1409.0575
  • 23. L.-C. Fengou, P. Tsakanikas, and G.-J. E. Nychas, “Rapid detection of minced pork and chicken adulteration in fresh, stored and cooked ground meat,” Food Control, vol. 125, p. 108002, 2021. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S0956713521001407
  • 24. K. He, R. Girshick, and P. Dollár, “Rethinking imagenet pre-training,” in Proceedings of the IEEE/CVF International Conference on Computer Vision, 2019, pp. 4918–4927.
Notes
1. Thematic Tracks Regular Papers
2. Record created with funds from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the programme "Społeczna odpowiedzialność nauki" (Social Responsibility of Science), module: Popularisation of Science and Promotion of Sport (2024).
Document type
YADDA identifier
bwmeta1.element.baztech-cad476c4-3717-4a69-8d32-db6ae4a9fd75