Article title
Content
Full texts:
Identifiers
Title variants
Publication languages
Abstracts
The article describes the procedure of image acquisition, including the sampling of the analyzed material as well as the technical solutions of hardware and preprocessing used in the research. A dataset of digital images containing identified objects was obtained with the help of an automated mechanical system controlling the microscope table and used to train the YOLO models. The performance of the YOLOv4 and YOLOv8 deep learning networks was compared on the basis of automatic image analysis. YOLO is a one-stage object detection model that examines the analyzed image only once: a single neural network divides the image into a grid of cells and, for each cell, predicts bounding boxes together with object class probabilities for each box. This approach allows real-time detection with minimal accuracy loss. The study involved the ciliated protozoan Vorticella as a test object. These organisms are found both in natural water bodies and in treatment plants that employ the activated sludge method. Owing to their distinct appearance, high abundance, and sedentary lifestyle, Vorticella are good subjects for detection tasks. To ensure that the training dataset was accurate, the images were manually labeled. The performance of the models was evaluated using metrics such as accuracy, precision, and recall. The final results show the differences in the metrics characterizing the obtained outputs and the progress made over subsequent versions of the YOLO algorithm.
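The evaluation described above (precision and recall of detections against manually labeled boxes) can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the `(x1, y1, x2, y2)` box format, the 0.5 IoU threshold, the greedy matching, and the toy boxes in the usage example are all assumptions made for the sketch.

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def precision_recall(detections, ground_truth, thr=0.5):
    """Greedy one-to-one matching: a detection is a true positive if it
    overlaps a not-yet-matched ground-truth box with IoU >= thr."""
    matched = set()
    tp = 0
    for det in detections:
        best, best_iou = None, thr
        for i, gt in enumerate(ground_truth):
            if i in matched:
                continue
            v = iou(det, gt)
            if v >= best_iou:
                best, best_iou = i, v
        if best is not None:
            matched.add(best)
            tp += 1
    fp = len(detections) - tp            # unmatched detections
    fn = len(ground_truth) - tp          # missed ground-truth objects
    precision = tp / (tp + fp) if detections else 0.0
    recall = tp / (tp + fn) if ground_truth else 0.0
    return precision, recall

# Illustrative boxes: two correct detections and one false positive.
dets = [(10, 10, 50, 50), (60, 60, 90, 90), (200, 200, 220, 220)]
gts = [(12, 12, 48, 52), (58, 62, 92, 88)]
p, r = precision_recall(dets, gts)   # p ≈ 0.67, r = 1.0
```

Here one spurious detection lowers precision to 2/3 while every labeled object is found, so recall stays at 1.0, mirroring how the two error types are reported separately in the study's metrics.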
Publisher
Year
Volume
Pages
51–61
Physical description
Bibliography: 35 items, figures
Authors
author
- Department of Water Supply and Wastewater Disposal, Faculty of Environmental Engineering, Lublin University of Technology, Nadbystrzycka 40B, Lublin, Poland
author
- Department of Applied Mathematics, Faculty of Mathematics and Information Technology, Lublin University of Technology, Nadbystrzycka 38, Lublin, Poland
author
- Department of Technical Computer Science, Faculty of Mathematics and Information Technology, Lublin University of Technology, Nadbystrzycka 38, Lublin, Poland
author
- Department of Fauna and Systematics of Invertebrates, National Academy of Sciences of Ukraine, 01030 Kyiv, Ukraine
author
- Department of Water Supply and Wastewater Disposal, Faculty of Environmental Engineering, Lublin University of Technology, Nadbystrzycka 40B, Lublin, Poland
Bibliography
- 1. Girshick R, Donahue J, Darrell T, Malik J. Rich feature hierarchies for accurate object detection and semantic segmentation. In: IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, 2014; 580–587, https://doi.org/10.1109/CVPR.2014.81.
- 2. Girshick R. Fast R-CNN. In: IEEE International Conference on Computer Vision (ICCV), Santiago, 2015; 1440–1448, https://doi.org/10.1109/ICCV.2015.169.
- 3. He K, Zhang X, Ren S and Sun J. Spatial pyramid pooling in deep convolutional networks for visual recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015; 37(9): 1904–1916, https://doi.org/10.1109/TPAMI.2015.2389824.
- 4. Papert S. The summer vision project. Massachusetts Institute of Technology, 1966.
- 5. Dziadosz M, Majerek D, Łagód G. Microscopic studies of activated sludge supported by automatic image analysis based on deep learning neural networks. Journal of Ecological Engineering, 2024, 25(4), 360–369.
- 6. Viola P, Jones M. Rapid object detection using a boosted cascade of simple features. In: Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), 2001; 1: I-511. https://doi.org/10.1109/CVPR.2001.990517.
- 7. Lowe DG. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 2004; 60(2): 91–110. https://doi.org/10.1023/B:VISI.0000029664.99615.94.
- 8. Bay H, Ess A, Tuytelaars T, Van Gool L. Speeded-up robust features (SURF). Computer Vision and Image Understanding, 2008; 110(3): 346–359. https://doi.org/10.1016/j.cviu.2007.09.014.
- 9. Krizhevsky A, Sutskever I, Hinton G. ImageNet classification with deep convolutional neural networks. Neural Information Processing Systems, 2012; 25, https://doi.org/10.1145/3065386.
- 10. Redmon J, Divvala S, Girshick R, Farhadi A. You only look once: unified, real-time object detection, University of Washington, Allen Institute for AI, Facebook AI Research 2016.
- 11. Ren S, He K, Girshick R, Sun J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017; 39(6): 1137–1149, https://doi.org/10.1109/TPAMI.2016.2577031.
- 12. https://docs.ultralytics.com/models/ (accessed on 14 June 2024).
- 13. Bochkovskiy A, Wang C, Liao HM. YOLOv4: Optimal Speed and Accuracy of Object Detection, arXiv: 2004.10934, 2020.
- 14. Dutta A, Zisserman A. The VIA Annotation Software for Images, Audio and Video. In: Proceedings of the 27th ACM International Conference on Multimedia (MM’19), October 21–25, Nice, France. ACM, New York, NY, USA 2019; 4. https://doi.org/10.1145/3343031.3350535.
- 15. Zhu H, Wang Y, Fan J. IA-Mask R-CNN: Improved anchor design mask R-CNN for surface defect detection of automotive engine parts. Applied Sciences. 2022; 12(13):6633. https://doi.org/10.3390/app12136633
- 16. Lecun Y, Bottou L, Bengio Y, Haffner P. Gradient-based learning applied to document recognition. In: Proceedings of the IEEE, 1998; 86(11): 2278–2324. https://doi.org/10.1109/5.726791.
- 17. Lin T, Maire M, Belongie S, Hays J, Perona P, Ramanan D, Dollár P, Zitnick CL. Microsoft COCO: Common objects in context, computer vision – ECCV 2014. Lecture Notes in Computer Science, 8693, Springer, Cham. https://doi.org/10.1007/978-3-319-10602-1_48 (accessed on 25.09.2020).
- 18. Titano JJ, Badgeley M, Scheffleinet J, Pain M, Su A, Cai M, Swinburne N, Zech J, Kim J, Bederson J, Mocco J, Drayer B, Lehar J, Cho S, Costa A, Oermann EK. Automated deep-neural-network surveillance of cranial images for acute neurologic events, Nat Med 2018; 24: 1337–1341. https://doi.org/10.1038/s41591-018-0147-y.
- 19. Redmon J, Farhadi A. YOLOv3: An Incremental Improvement, University of Washington, 2018, arXiv: 1804.02767.
- 20. Liu S, Qi L, Qin H, Shi J, Jia J. Path aggregation network for instance segmentation. In: IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, 2018; 8759–8768, https://doi.org/10.1109/CVPR.2018.00913.
- 21. Wang C, Mark Liao H, Wu Y, Chen P, Hsieh J, Yeh I. CSPNet: A new backbone that can enhance learning capability of CNN. In: IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Seattle, WA, USA, 2020; 1571–1580. https://doi.org/10.1109/CVPRW50498.2020.00203.
- 22. Curds CR. 1992. Protozoa and the water industry. I–IV. Cambridge University Press, Cambridge, New York, Sydney.
- 23. Arregui L, Liébana R, Calvo P, Pérez-Uz B, Salvadó H, Serrano S. Bioindication in activated sludge wastewater treatment plants. In: Valdez, CJ, Maradona, EM. (Eds.), Handbook of Wastewater Treatment, 2013; 277–291.
- 24. Li J., Ma L., Wei S., Horn H. Aerobic granules dwelling vorticella and rotifers in an SBR fed with domestic wastewater. Separation Purification Technol. 2013; 110: 127–31.
- 25. Jaromin-Gleń K, Babko R, Łagód G, Sobczuk H. Community composition and abundance of protozoa under different concentration of nitrogen compounds at “Hajdow” wastewater treatment plant. Ecological Chemistry and Engineering S. 2013; 20(1): 127–139. https://doi.org/10.2478/eces-2013-0010.
- 26. Madoni P. Protozoa in wastewater treatment processes: A minireview. Italian Journal of Zoology, 2011; 78 (1), 3–11.
- 27. Foissner W. Protists as bioindicators in activated sludge: Identification, ecology and future needs. Eur J Protistol, 2016; 55, 75–94.
- 28. Arregui L, Pérez-Uz B, Salvadó H, Serrano S. Progresses on the knowledge about the ecological function and structure of the protists community in activated sludge wastewater treatment plants. Current Research, Technology and Education Topics in Applied Microbiology and Microbial Biotechnology, 2010; 2(2): 972–979.
- 29. Moreira YC, Cardoso SJ, Siqueira-Castro ICV, Greinert-Goulart JA, Franco RMB, Graco-Roza C, Dias RJP. Ciliate Communities respond via their traits to a wastewater treatment plant with a combined UASB-Activated sludge system. Frontiers in Environmental Science, 2022; 10: 903984.
- 30. Babko R, Kuzmina T, Pliashechnik V, Zaburko J, Szulżyk-Cieplak J, Łagód G. Diversity of Peritricha (Ciliophora) in activated sludge depending on the technology of wastewater treatment. Journal of Ecological Engineering 2024; 25(2): 158–166.
- 31. Pliashechnyk V, Danko Y, Łagód G, Drewnowski J, Kuzmina T, Babko R. Ciliated protozoa in the impact zone of the Uzhgorod treatment plant. E3S Web of Conferences 2018; 30(02008): 1–7.
- 32. Babko R, Pliashechnyk V, Zaburko J, Danko Y, Kuzmina T, Czarnota J, Szulżyk-Cieplak J, Łagód G. Ratio of abundances of ciliates behavioral groups as an indicator of the treated wastewater impact on rivers. PLoS ONE, 2022; 17(10): e0275629.
- 33. https://github.com/RangeKing (accessed on 14 June 2024).
- 34. Jocher G., Chaurasia A., Qiu J. YOLO by Ultralytics (Version 8.0.0) [Computer software]. https://github.com/ultralytics/ultralytics 2023 (accessed on 14 June 2024).
- 35. Pérez-Uz B, Arregui L, Calvo P, Salvadó H, Fernández N, Rodríguez E, Zornoza A, Serrano S. Assessment of plausible bioindicators for plant performance in advanced wastewater treatment systems. Water Research, 2010; 44: 5059–5069.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-562da30e-94a0-4a9a-b01e-338fdc5c17c6