Article title

Real-time detection and classification of fish in underwater environment using YOLOv5: a comparative study of deep learning architectures

Content
Identifiers
Title variants
PL
Wykrywanie i klasyfikacja ryb w czasie rzeczywistym w środowisku podwodnym przy użyciu YOLOv5: badanie porównawcze architektur głębokiego uczenia
Publication languages
EN
Abstracts
EN
This article explores techniques for the detection and classification of fish as an integral part of underwater environmental monitoring systems. Employing an innovative approach, the study focuses on developing real-time methods for high-precision fish detection and classification. The implementation of cutting-edge technologies, such as YOLO (You Only Look Once) V5, forms the basis for an efficient and responsive system. The study also evaluates various approaches in the context of deep learning to compare the performance and accuracy of fish detection and classification. The results of this research are expected to contribute to the development of more advanced and effective aquatic monitoring systems for understanding underwater ecosystems and conservation efforts.
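A core post-processing step in one-stage detectors such as the YOLOv5 model described above is non-maximum suppression (NMS), which keeps only the highest-confidence box among overlapping candidate detections. The sketch below is illustrative only, not the authors' implementation; the box coordinates, scores, and threshold are hypothetical:

```python
# Illustrative sketch of IoU-based non-maximum suppression, the filtering
# step YOLO-style detectors apply before reporting final detections.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(detections, iou_thresh=0.45):
    """Greedy NMS: detections is a list of (box, score, label) tuples;
    boxes overlapping an already-kept box above iou_thresh are dropped."""
    kept = []
    for det in sorted(detections, key=lambda d: d[1], reverse=True):
        if all(iou(det[0], k[0]) <= iou_thresh for k in kept):
            kept.append(det)
    return kept

# Two overlapping candidates for the same fish plus one distinct fish.
raw = [
    ((10, 10, 50, 40), 0.92, "fish"),
    ((12, 11, 52, 42), 0.80, "fish"),   # heavy overlap with the first box
    ((100, 60, 140, 90), 0.75, "fish"),
]
print(nms(raw))  # the 0.80 duplicate is suppressed; two fish remain
```

In practice YOLOv5 performs this on GPU over class-wise confidence-filtered predictions, but the greedy logic is the same.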
PL
Niniejszy artykuł bada metody wykrywania i klasyfikacji ryb jako integralną część podwodnych systemów monitorowania środowiska. Wykorzystując innowacyjne podejście, badania koncentrują się na opracowaniu metod w czasie rzeczywistym do bardzo dokładnego wykrywania i klasyfikacji ryb. Wprowadzenie zaawansowanych technologii, takich jak YOLO (You Only Look Once) V5, stanowi podstawę wydajnego i responsywnego systemu. Badanie ocenia również różne podejścia w kontekście głębokiego uczenia się, aby porównać wydajność i dokładność wykrywania i klasyfikacji ryb. Oczekuje się, że wyniki tych badań przyczynią się do rozwoju bardziej zaawansowanych i wydajnych systemów monitorowania zbiorników wodnych w celu zrozumienia podwodnych ekosystemów i wysiłków na rzecz ochrony przyrody.
Year
Pages
91–95
Physical description
Bibliography: 29 items, figures, charts
Authors
  • Universiti Malaysia Terengganu, Faculty of Ocean Engineering Technology and Informatics, Kuala Nerus, Malaysia
  • Universiti Malaysia Terengganu, Faculty of Ocean Engineering Technology and Informatics, Kuala Nerus, Malaysia
  • Universitas Islam Negeri Sunan Gunung Djati, Department of Physics, Bandung, Indonesia
  • Universiti Sultan Zainal Abidin, Faculty of Informatics and Computing, Campus Besut, Malaysia
  • Universitas Muhammadiyah Tasikmalaya, Department of Mechanical Engineering, Tasikmalaya, Indonesia
  • Yuriy Fedkovych Chernivtsi National University, Department of Radio Engineering and Information Security, Chernivtsi, Ukraine
  • Yuriy Fedkovych Chernivtsi National University, Department of Radio Engineering and Information Security, Chernivtsi, Ukraine
Bibliography
  • [1] Abdul Aziz M. F. et al.: Development of Smart Sorting Machine using artificial intelligence for Chili Fertigation Industries. Journal of Automation, Mobile Robotics and Intelligent Systems 28, 2022, 44–52 [https://doi.org/10.14313/jamris/4-2021/26].
  • [2] Ayob A. et al.: Analysis of pruned neural networks (mobilenetv2-yolo v2) for underwater object detection. 11th National Technical Seminar on Unmanned System Technology 2019 NUSYS’19, Springer Singapore, Singapore, 2021, 87–98.
  • [3] Boudhane M., Benayad N.: Underwater Image Processing Method for Fish Localization and Detection in Submarine Environment. Journal of Visual Communication and Image Representation 39, 2016, 226–238 [https://doi.org/10.1016/j.jvcir.2016.05.017].
  • [4] Brownscombe J. W. et al.: The Future of Recreational Fisheries: Advances in Science, Monitoring, Management, and Practice. Fisheries Research 211, 2019, 247–255 [https://doi.org/10.1016/j.fishres.2018.10.019].
  • [5] Chen P. H. C. et al.: An Augmented Reality Microscope with Real-time Artificial Intelligence Integration for Cancer Diagnosis. Nature Medicine 25(9), 2019, 1453–1457 [https://doi.org/10.1038/s41591-019-0539-7].
  • [6] Du J.: Understanding of Object Detection Based on CNN Family and YOLO. Journal of Physics: Conference Series 1004, 2018, 012029 [https://doi.org/10.1088/1742-6596/1004/1/012029].
  • [7] Fan F.-L. et al.: On Interpretability of Artificial Neural Networks: A Survey. IEEE Transactions on Radiation and Plasma Medical Sciences 5(6), 2021, 741–760 [https://doi.org/10.1109/trpms.2021.3066428].
  • [8] Hong S. et al.: Opportunities and Challenges of Deep Learning Methods for Electrocardiogram Data: A Systematic Review. Computers in Biology and Medicine 122, 2020, 103801 [https://doi.org/10.1016/j.compbiomed.2020.103801].
  • [9] Hu J. et al.: Real-time Nondestructive Fish Behavior Detecting in Mixed Polyculture System Using Deep-learning and Low-cost Devices. Expert Systems With Applications 178, 2021, 115051 [https://doi.org/10.1016/j.eswa.2021.115051].
  • [10] Iqbal M. A. et al.: Automatic Fish Species Classification Using Deep Convolutional Neural Networks. Wireless Personal Communications 116(2), 2019, 1043–1053 [https://doi.org/10.1007/s11277-019-06634-1].
  • [11] Isabelle D. A., Westerlund M.: A Review and Categorization of Artificial Intelligence-Based Opportunities in Wildlife, Ocean and Land Conservation. Sustainability 14(4), 2022, 1979 [https://doi.org/10.3390/su14041979].
  • [12] Ismail N., Owais A. M.: Real-time Visual Inspection System for Grading Fruits Using Computer Vision and Deep Learning Techniques. Information Processing in Agriculture 9(1), 2022, 24–37 [https://doi.org/10.1016/j.inpa.2021.01.005].
  • [13] Jalal A. et al.: Fish Detection and Species Classification in Underwater Environments Using Deep Learning with Temporal Information. Ecological Informatics 57, 2020, 101088 [https://doi.org/10.1016/j.ecoinf.2020.101088].
  • [14] Jing L. et al.: Video You Only Look Once: Overall Temporal Convolutions for Action Recognition. Journal of Visual Communication and Image Representation 52, 2018, 58–65 [https://doi.org/10.1016/j.jvcir.2018.01.016].
  • [15] Khan A. N. et al.: Sectorial Study of Technological Progress and CO2 Emission: Insights From a Developing Economy. Technological Forecasting and Social Change 151, 2020, 119862 [https://doi.org/10.1016/j.techfore.2019.119862].
  • [16] Khokher M. R. et al.: Early Lessons in Deploying Cameras and Artificial Intelligence Technology for Fisheries Catch Monitoring: Where Machine Learning Meets Commercial Fishing. Canadian Journal of Fisheries and Aquatic Sciences 79(2), 2022, 257–266 [https://doi.org/10.1139/cjfas-2020-0446].
  • [17] Klapp I. et al.: Ornamental Fish Counting by Non-imaging Optical System for Real-time Applications. Computers and Electronics in Agriculture 153, 2018, 126–133 [https://doi.org/10.1016/j.compag.2018.08.007].
  • [18] Liu H., Lang B.: Machine Learning and Deep Learning Methods for Intrusion Detection Systems: A Survey. Applied Sciences 9(20), 2019, 4396 [https://doi.org/10.3390/app9204396].
  • [19] Mada Sanjaya W. S.: Deep Learning Citra Medis Berbasis Pemrograman Python. Bolabot, 2023.
  • [20] Redmon J. et al.: You Only Look Once: Unified, Real-Time Object Detection. arXiv, 2015 [https://arxiv.org/abs/1506.02640].
  • [21] Reynard D., Shirgaokar M.: Harnessing the Power of Machine Learning: Can Twitter Data Be Useful in Guiding Resource Allocation Decisions During a Natural Disaster? Transportation Research Part D: Transport and Environment 77, 2019, 449–463 [https://doi.org/10.1016/j.trd.2019.03.002].
  • [22] Rico-Díaz Á. J. et al.: An Application of Fish Detection Based on Eye Search With Artificial Vision and Artificial Neural Networks. Water 12(11), 2020, 3013 [https://doi.org/10.3390/w12113013].
  • [23] Sanjaya W. S. et al.: The Design of Face Recognition and Tracking for Human-robot Interaction. 2nd International Conferences on Information Technology, Information Systems and Electrical Engineering (ICITISEE). IEEE, 2017 [https://doi.org/10.1109/icitisee.2017.8285519].
  • [24] Shafiee M. J. et al.: Fast YOLO: A Fast You Only Look Once System for Real-time Embedded Object Detection in Video. arXiv, 2017 [https://arxiv.org/abs/1709.05943].
  • [25] Unlu E. et al.: Deep Learning-based Strategies for the Detection and Tracking of Drones Using Several Cameras. IPSJ Transactions on Computer Vision and Applications 11(1), 2019 [https://doi.org/10.1186/s41074-019-0059-x].
  • [26] Wang D. et al.: UAV Environmental Perception and Autonomous Obstacle Avoidance: A Deep Learning and Depth Camera Combined Solution. Computers and Electronics in Agriculture 175, 2020, 105523 [https://doi.org/10.1016/j.compag.2020.105523].
  • [27] Xiu L. et al.: Fast Accurate Fish Detection and Recognition of Underwater Images With Fast R-CNN. OCEANS 2015 – MTS/IEEE Washington. IEEE, 2015 [https://doi.org/10.23919/oceans.2015.7404464].
  • [28] Zhang L. et al.: Automatic Fish Counting Method Using Image Density Grading and Local Regression. Computers and Electronics in Agriculture 179, 2020, 105844 [https://doi.org/10.1016/j.compag.2020.105844].
  • [29] Zhao Zhong-Qiu et al.: Object Detection With Deep Learning: A Review. IEEE Transactions on Neural Networks and Learning Systems 30(11), 2019, 3212–3232 [https://doi.org/10.1109/tnnls.2018.2876865].
Document type
YADDA identifier
bwmeta1.element.baztech-d0740742-a648-4953-beab-832b420fe93f