2023 | Vol. 99, No. 2 | 109–113
Article title

Development of a detection system for people drowning through aerial images and convolutional neural networks

Title variants
PL
Opracowanie systemu wykrywania tonących osób na podstawie zdjęć lotniczych i konwolucyjnych sieci neuronowych
Publication languages
EN
Abstracts
EN
This work develops a system that determines the presence of a person in the water using convolutional neural network (CNN) classification. When a drowning alert is raised, the drone camera takes aerial captures of the water area; a first trained CNN determines whether the scene shows deep water, while a second CNN identifies the presence of a person in the video frame. If both detections are positive, the drone drops the life ring it carries. Tests confirm that this system can deliver a means of survival to the person in a timely manner.
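The two-stage decision described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the 0.5 threshold, and the stub classifiers are assumptions standing in for the two trained CNNs, which would each return a confidence score in [0, 1].

```python
# Sketch of the sequential two-CNN decision pipeline from the abstract.
# `deep_water_cnn` and `person_cnn` are hypothetical callables standing in
# for the trained networks; each maps a frame to a confidence in [0, 1].

THRESHOLD = 0.5  # assumed decision threshold for both classifiers


def should_release_ring(frame, deep_water_cnn, person_cnn, threshold=THRESHOLD):
    """Return True only if the frame shows deep water AND a person.

    The person classifier runs only when the deep-water check is positive,
    mirroring the sequential pipeline described in the abstract.
    """
    if deep_water_cnn(frame) < threshold:
        return False  # shallow water: never drop the ring here
    return person_cnn(frame) >= threshold


# Toy stand-ins for the trained networks (illustrative only):
deep_water = lambda frame: frame.get("deep_score", 0.0)
person = lambda frame: frame.get("person_score", 0.0)

print(should_release_ring({"deep_score": 0.9, "person_score": 0.8}, deep_water, person))  # True
print(should_release_ring({"deep_score": 0.9, "person_score": 0.1}, deep_water, person))  # False
print(should_release_ring({"deep_score": 0.2, "person_score": 0.9}, deep_water, person))  # False
```

Evaluating the classifiers sequentially means the (presumably heavier) person detector only runs on frames that already passed the deep-water check, which also matches the safety requirement that the ring is dropped only over open water.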
Publisher

Year
Pages
109–113
Physical description
Bibliography: 28 items, figures, tables
Authors
Bibliography
  • [1] World Health Organization: World Report on Drowning. 2016.
  • [2] Anders Bäckman, Jacob Hollenberg, Leif Svensson, Mattias Ringh, Per Nordberg, Therese Djärv, Sune Forsberg, Olof Hernborg, Andreas Claesson: Drones for Provision of Flotation Support in Simulated Drowning. Air Medical Journal, vol. 37, pp. 170–173, 2018. doi: 10.1016/j.amj.2018.01.007.
  • [3] Truhlár A., Deakin C.D., Soar J., Khalifa G.E., Alfonzo A., Bierens J.J.: European Resuscitation Council guidelines for resuscitation 2015: Section 4. Cardiac arrest in special circumstances. Resuscitation, 2015; 95:148–201.
  • [4] Evjemo L.D., Gjerstad T., Grøtli E.I. et al.: Trends in Smart Manufacturing: Role of Humans and Industrial Robots in Smart Factories. Curr Robot Rep 1, 35–41 (2020). doi: 10.1007/s43154-020-00006-5.
  • [5] B. Ichter and M. Pavone: "Robot Motion Planning in Learned Latent Spaces," IEEE Robotics and Automation Letters, vol. 4, no. 3, pp. 2407–2414, July 2019.
  • [6] M. A. Andres, L. Pari and S. C. Elvis: "Design of a User Interface to Estimate Distance of Moving Explosive Devices with Stereo Cameras," 2021 6th International Conference on Image, Vision and Computing (ICIVC), 2021, pp. 362–366. doi: 10.1109/ICIVC52351.2021.9526934.
  • [7] Goyzueta, D.V.; Guevara M., J.; Montoya A., A.; Sulla E., E.; Lester S., Y.; L., P.; C., E.S.: Analysis of a User Interface Based on Multimodal Interaction to Control a Robotic Arm for EOD Applications. Electronics 2022, 11, 1690. doi: 10.3390/electronics11111690.
  • [8] Rafael Verano M., Jose Caceres S., Abel Arenas H., Andres Montoya A., Joseph Guevara M., Jarelh Galdos B. and Jesus Talavera S.: "Development of a Low-Cost Teleoperated Explorer Robot (TXRob)". International Journal of Advanced Computer Science and Applications (IJACSA), 13(7), 2022. doi: 10.14569/IJACSA.2022.01307104.
  • [9] Andres Montoya A., Lizardo Pari P., Erasmo Sulla E., Elvis Supo C.: Assisted operation of a robotic arm based on stereo vision for positioning near an explosive device. MDPI Robotics, 2022.
  • [10] Singh J., Gandhi D., Sanghani M., Robi P.S., Dwivedy S.K.: Design and development of underwater robot. In Proceedings of the International Conference on Robotics, Automation, Control and Embedded Systems (RACE 2015), Chennai, India, 18–20 February 2015. doi: 10.1109/RACE.2015.7097243.
  • [11] Chu W.S., Lee K.T., Song S.H. et al.: Review of biomimetic underwater robots using smart actuators. Int. J. Precis. Eng. Manuf. 13, 1281–1292 (2012). doi: 10.1007/s12541-012-0171-7.
  • [12] Ridolfi A., Secciani N., Stroobant M., Franchi M., Zacchini L., Costanzi R., Peralta G., Cipriani L.E.: Marine Robotics for Recurrent Morphological Investigations of Micro-Tidal Marine Coastal Environments. A Point of View. J. Mar. Sci. Eng. 2021, 9, 1111. doi: 10.3390/jmse9101111.
  • [13] A. S. Lafmejani et al.: "Kinematic Modeling and Trajectory Tracking Control of an Octopus-Inspired Hyper-Redundant Robot," IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 3460–3467, April 2020. doi: 10.1109/LRA.2020.2976328.
  • [14] Dominik Fill, Emily Strauss, Cole Dunning, Anastasia Western and Mostafa Hassanalian: Amphibious Bioinspired Robots for Ocean Objects Identification. AIAA 2021-2781, 2021. doi: 10.2514/6.2021-2781.
  • [15] F. Xie, Z. Li, Y. Ding, Y. Zhong and R. Du: "An Experimental Study on the Fish Body Flapping Patterns by Using a Biomimetic Robot Fish," IEEE Robotics and Automation Letters, vol. 5, no. 1, pp. 64–71, Jan. 2020. doi: 10.1109/LRA.2019.2941827.
  • [16] I. Belkin, J. B. de Sousa, J. Pinto, R. Mendes and F. López-Castejón: "Marine robotics exploration of a large-scale open-ocean front," 2018 IEEE/OES Autonomous Underwater Vehicle Workshop (AUV), 2018, pp. 1–4. doi: 10.1109/AUV.2018.8729725.
  • [17] Joseph J., Berchmans N., Varghese M.R., Krishnan M., Arya B.S., Antony M.: Arduino Based Automatic Human Seeker Robot for Disaster Management. 2019 2nd International Conference on Intelligent Computing, Instrumentation and Control Technologies (ICICICT). doi: 10.1109/icicict46008.2019.89
  • [18] S. Kang, J. Yu, J. Zhang and Q. Jin: "Development of Multibody Marine Robots: A Review," IEEE Access, vol. 8, pp. 21178–21195, 2020. doi: 10.1109/ACCESS.2020.2969517.
  • [19] Zhixiang Liu, Youmin Zhang, Xiang Yu, Chi Yuan: Unmanned surface vehicles: An overview of developments and challenges. Annual Reviews in Control, vol. 41, pp. 71–93, 2016. doi: 10.1016/j.arcontrol.2016.04.018.
  • [20] Bonfitto A., Amati N.: Mechanical Construction and Propulsion Analysis of a Rescue Underwater Robot in the Case of Drowning Persons. Applied Sciences, 2018; 8(5):693. doi: 10.3390/app8050693.
  • [21] Nishan B. Shetty, Naveen Rao, Pruthviraj Umesh, and K. V. Gangadharan: "Remotely operated marine rescue vehicle", AIP Conference Proceedings 2247, 020022 (2020). doi: 10.1063/5.0004147.
  • [22] Wei, Yang; Machica, Ivy; Dumdumaya, Cristina; Arroyo, Jan Carlo; Delima, Allemar Jhone (2022): Liveness Detection Based on Improved Convolutional Neural Network for Face Recognition Security. International Journal of Emerging Technology and Advanced Engineering, 12, 45–53. doi: 10.46338/ijetae082206.
  • [23] Chin, Sze Gaik; Tay, Chew Chang; Choon Huong, Audrey; Rahim, Ruzairi (2021): Dorsal hand vein authentication system using artificial neural network. Indonesian Journal of Electrical Engineering and Computer Science, 21(3), 1837–1846. doi: 10.11591/ijeecs.v21.i3.pp1837-1846.
  • [24] Minarno A.E., Wandani L.R. and Azhar Y. (2022): "Classification of Breast Cancer Based on Histopathological Image Using EfficientNet-B0 on Convolutional Neural Network," International Journal of Emerging Technology and Advanced Engineering, 12(8), p. 70. doi: 10.46338/ijetae082209.
  • [25] Abdelhafid E., Aymane E., Benayad N., Abdelalim S., El Y.A.M.H., Rachid O.H.T., and Brahim B. (2022): ECG arrhythmia classification using convolutional neural network. International Journal of Emerging Technology and Advanced Engineering, 12(7), 186–195. doi: 10.46338/ijetae072219.
  • [26] M. Ribeiro, A. E. Lazzaretti, and H. S. Lopes: "A study of deep convolutional auto-encoders for anomaly detection in videos," Pattern Recognit. Lett., vol. 105, pp. 13–22, 2018.
  • [27] Ke Q., Liu J., Bennamoun M., An S., Sohel F., Boussaid F. (2018): Computer vision for human-machine interaction. In Computer Vision for Assistive Healthcare, pp. 127–145. doi: 10.1016/B978-0-12-813445-0.00005-8.
  • [28] Ministry of Transport and Communications: Law that regulates the use and operations of Remotely Piloted Aircraft Systems (RPAS). Law N. 30740. 2019.
Notes
Record created with funds of the Ministry of Science and Higher Education (MNiSW), agreement no. SONP/SP/546092/2022, under the programme "Social Responsibility of Science", module: Popularisation of science and promotion of sport (2024).
Document type
Bibliography
Identifiers
YADDA identifier
bwmeta1.element.baztech-9f9c5884-dc6e-4c6f-8e30-7e867e5820e4