Article title

IoT-based real-time passenger safety system with Machine Vision at the Edge (Mez) technology

Title variants
PL
System bezpieczeństwa pasażerów w czasie rzeczywistym oparty na IoT z technologią Machine Vision at the Edge (Mez)
Publication languages
EN
Abstracts
EN
Passenger safety is a critical issue in the transportation industry. For children, there is an additional risk beyond the usual concerns: accidents caused by putting hands, heads, or the upper half of the body out of the window. Children are curious and fun-loving and enjoy school bus rides with their friends, but their activities are not always pleasant to adults and can easily distract the driver. A stable emotional state in school bus drivers is therefore essential. This paper presents an innovative IoT-based passenger safety system developed to address the safety concerns associated with school buses. The IR-based sensor in this project prevents passengers from crossing the safety limit outside the window. A well-optimized Convolutional Neural Network (CNN) has been designed and developed to predict the risk level by reading the emotional state of the driver. Real-time video transmission is essential for recognizing the driver’s emotional state, but it is severely hampered by network latency. This paper incorporates Machine Vision at the Edge (Mez) technology to solve the latency issue and effectively detect the driver’s emotion in real time. This innovative safety system is a potential solution to the unaddressed safety concern of children’s school buses, and the paper’s unique approach to solving a practical problem strengthens bus passenger safety.
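The abstract describes three cooperating components: an IR sensor that detects when a passenger crosses the window safety limit, a CNN that classifies the driver's emotional state from video delivered through the Mez edge messaging layer, and a decision step that combines the two cues into a risk level. The sketch below is a minimal, hypothetical illustration of how such a fusion step could look; it is not the authors' implementation, and every name in it (read_ir_distance_cm, EmotionClassifier, the thresholds and emotion labels) is an assumption made for illustration only.

```python
# Minimal illustrative sketch -- not the authors' implementation -- of how the
# two cues described in the abstract (an IR window-limit sensor and a CNN-based
# driver-emotion classifier fed by video over the Mez edge messaging layer)
# might be fused into a coarse risk level. Every name, threshold, and label
# below is a hypothetical placeholder.

import random

IR_SAFE_DISTANCE_CM = 15.0                       # assumed window safety limit
HIGH_RISK_EMOTIONS = {"angry", "fear", "sad"}    # assumed high-risk driver states


def read_ir_distance_cm() -> float:
    """Placeholder for an IR proximity reading taken at the window frame."""
    return random.uniform(0.0, 50.0)             # stand-in for real sensor data


class EmotionClassifier:
    """Stand-in for the paper's CNN; a real system would run inference on a
    video frame received through the edge messaging layer."""

    def predict(self, frame) -> str:
        # A trained CNN would map the frame to one of several emotion labels.
        return random.choice(["neutral", "happy", "angry", "fear", "sad"])


def risk_level(distance_cm: float, driver_emotion: str) -> str:
    """Combine the window-breach cue and the driver-emotion cue."""
    window_breach = distance_cm < IR_SAFE_DISTANCE_CM
    driver_distracted = driver_emotion in HIGH_RISK_EMOTIONS
    if window_breach and driver_distracted:
        return "HIGH"
    if window_breach or driver_distracted:
        return "MEDIUM"
    return "LOW"


if __name__ == "__main__":
    classifier = EmotionClassifier()
    frame = None                                 # a real frame would come from the bus camera
    print("risk level:", risk_level(read_ir_distance_cm(), classifier.predict(frame)))
```

In a deployed system the classifier would run CNN inference on frames streamed from the bus camera over the low-latency edge link, and the resulting risk level would trigger an alarm or a notification to the operator.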
Year
Pages
159–167
Physical description
Bibliography: 35 items, figures, tables
Bibliography
  • [1] Z. S. Khan, W. He, and M. Menéndez, “Application of modular vehicle technology to mitigate bus bunching,” Transportation Research Part C: Emerging Technologies, vol. 146, p. 103953, 2023.
  • [2] C. Utary, D. S. Nabahan, E. Budianto, and S. N. I. Sari, “Planning of school safety zone (zoss) on the education road of merauke regency,” in MATEC Web of Conferences, vol. 372, p. 07004, EDP Sciences, 2022.
  • [3] J. Hinton, B. Watson, and O. Oviedo-Trespalacios, “A novel conceptual framework investigating the relationship between roadside advertising and road safety: the driver behaviour and roadside advertising conceptual framework,” Transportation research part F: traffic psychology and behaviour, vol. 85, pp. 221–235, 2022.
  • [4] J. Ahmed, N. Ward, J. Otto, and A. McMahill, “How does emotional intelligence predict driving behaviors among non-commercial drivers?,” Transportation research part F: traffic psychology and behaviour, vol. 85, pp. 38–46, 2022.
  • [5] S. Saurav, P. Gidde, R. Saini, and S. Singh, “Dual integrated convolutional neural network for real-time facial expression recognition in the wild,” The Visual Computer, pp. 1–14, 2022.
  • [6] Z. Lei, S. Ren, Y. Hu, W. Zhang, and S. Chen, “Latency-aware collaborative perception,” in Computer Vision–ECCV 2022: 17th European Conference, Tel Aviv, Israel, October 23–27, 2022, Proceedings, Part XXXII, pp. 316–332, Springer, 2022.
  • [7] A. George, A. Ravindran, M. Mendieta, and H. Tabkhi, “Mez: An adaptive messaging system for latency-sensitive multi-camera machine vision at the iot edge,” IEEE Access, vol. 9, pp. 21457–21473, 2021.
  • [8] S. Huai, D. Liu, H. Kong, W. Liu, R. Subramaniam, C. Makaya, and Q. Lin, “Latency-constrained dnn architecture learning for edge systems using zerorized batch normalization,” Future Generation Computer Systems, vol. 142, pp. 314–327, 2023.
  • [9] Y.-C. Chen, E. M. Nahum, R. J. Gibbens, D. Towsley, and Y.-s. Lim, “Characterizing 4g and 3g networks: Supporting mobility with multipath tcp,” School of Computer Science, University of Massachusetts Amherst, Tech. Rep, vol. 22, 2012.
  • [10] A. Imteaj, U. Thakker, S. Wang, J. Li, and M. H. Amini, “A survey on federated learning for resource-constrained iot devices,” IEEE Internet of Things Journal, vol. 9, no. 1, pp. 1–24, 2021.
  • [11] D.-R. Berte, “Defining the iot,” in Proceedings of the international conference on business excellence, vol. 12, pp. 118–128, 2018.
  • [12] A. Čolaković and M. Hadžialić, “Internet of things (iot): A review of enabling technologies, challenges, and open research issues,” Computer Networks, vol. 144, pp. 17–39, 2018.
  • [13] L. Babangida, T. Perumal, N. Mustapha, and R. Yaakob, “Internet of things (iot) based activity recognition strategies in smart homes: A review,” IEEE Sensors Journal, 2022.
  • [14] S. Jha, N. Jha, D. Prashar, S. Ahmad, B. Alouffi, and A. Alharbi, “Integrated iot-based secure and efficient key management framework using hashgraphs for autonomous vehicles to ensure road safety,” Sensors, vol. 22, no. 7, p. 2529, 2022.
  • [15] N. Faruqui, M. A. Yousuf, M. Whaiduzzaman, A. Azad, A. Barros, and M. A. Moni, “Lungnet: A hybrid deep-cnn model for lung cancer diagnosis using ct and wearable sensor-based medical iot data,” Computers in Biology and Medicine, vol. 139, p. 104961, 2021.
  • [16] H. T. Truong, B. P. Ta, Q. A. Le, D. M. Nguyen, C. T. Le, H. X. Nguyen, H. T. Do, H. T. Nguyen, and K. P. Tran, “Light-weight federated learning-based anomaly detection for time-series data in industrial control systems,” Computers in Industry, vol. 140, p. 103692, 2022.
  • [17] M. S. Farooq, S. Riaz, A. Abid, T. Umer, and Y. B. Zikria, “Role of iot technology in agriculture: A systematic literature review,” Electronics, vol. 9, no. 2, p. 319, 2020.
  • [18] S. Muthuramalingam, A. Bharathi, S. Rakesh Kumar, N. Gayathri, R. Sathiyaraj, and B. Balamurugan, “Iot based intelligent transportation system (iot-its) for global perspective: A case study,” Internet of Things and Big Data Analytics for Smart Generation, pp. 279–300, 2019.
  • [19] M. F. Elahe, M. Jin, and P. Zeng, “Knowledge-based systematic feature extraction for identifying households with plug-in electric vehicles,” IEEE Transactions on Smart Grid, vol. 13, no. 3, pp. 2259–2268, 2022.
  • [20] T. Boshita, H. Suzuki, and Y. Matsumoto, “Iot-based bus location system using lorawan,” in 2018 21st International Conference on Intelligent Transportation Systems (ITSC), pp. 933–938, IEEE, 2018.
  • [21] S. Geetha and D. Cicilia, “Iot enabled intelligent bus transportation system,” in 2017 2nd International Conference on Communication and Electronics Systems (ICCES), pp. 7–11, IEEE, 2017.
  • [22] A. Awajan, “A novel deep learning-based intrusion detection system for iot networks,” Computers, vol. 12, no. 2, p. 34, 2023.
  • [23] J. Jabamony and G. R. Shanmugavel, “Iot based bus arrival time prediction using artificial neural network (ann) for smart public transport system (spts),” International Journal of Intelligent Engineering & Systems, vol. 13, 2019.
  • [24] K. Sridevi, A. Jeevitha, K. Kavitha, K. Sathya, and K. Narmadha, “Smart bus tracking and management system using iot,” Asian Journal of Applied Science and Technology (AJAST) Volume, vol. 1, 2017.
  • [25] Y. Zhu, F. Yan, J.-S. Pan, L. Yu, Y. Bai, W. Wang, C. He, and Z. Shi, “Mutigroup-based phasmatodea population evolution algorithm with mutistrategy for iot electric bus scheduling,” Wireless Communications and Mobile Computing, vol. 2022, 2022.
  • [26] R. S. Krishnan, A. Kannan, G. Manikandan, S. S. KB, V. K. Sankar, and K. L. Narayanan, “Secured college bus management system using iot for covid-19 pandemic situation,” in 2021 third international conference on intelligent communication technologies and virtual mobile networks (ICICV), pp. 376–382, IEEE, 2021.
  • [27] A. Rahmatulloh, F. M. Nursuwars, I. Darmawan, and G. Febrizki, “Applied internet of things (iot): the prototype bus passenger monitoring system using pir sensor,” in 2020 8th International Conference on Information and Communication Technology (ICoICT), pp. 1–6, IEEE, 2020.
  • [28] W. Mellouk and W. Handouzi, “Facial emotion recognition using deep learning: review and insights,” Procedia Computer Science, vol. 175, pp. 689–694, 2020.
  • [29] M. S. Arman, M. R. Alam, H. Jahan, L. Islam, M. H. Sammak, and K. B. M. B. Biplob, “A data mining approach to finding face mask from bangladeshi news channel,” in 2022 13th International Conference on Computing Communication and Networking Technologies (ICCCNT), pp. 1–7, IEEE, 2022.
  • [30] I. Adjabi, A. Ouahabi, A. Benzaoui, and A. Taleb-Ahmed, “Past, present, and future of face recognition: A review,” Electronics, vol. 9, no. 8, p. 1188, 2020.
  • [31] F. Z. Canal, T. R. Müller, J. C. Matias, G. G. Scotton, A. R. de Sá Júnior, E. Pozzebon, and A. C. Sobieranski, “A survey on facial emotion recognition techniques: A state-of-the-art literature review,” Information Sciences, vol. 582, pp. 593–617, 2022.
  • [32] R. Pathak and Y. Singh, “Real time baby facial expression recognition using deep learning and iot edge computing,” in 2020 5th International Conference on Computing, Communication and Security (ICCCS), pp. 1–6, IEEE, 2020.
  • [33] M. S. Hossain and G. Muhammad, “Emotion recognition using secure edge and cloud computing,” Information Sciences, vol. 504, pp. 589–601, 2019.
  • [34] S. Barra, S. Hossain, C. Pero, and S. Umer, “A facial expression recognition approach for social iot frameworks,” Big Data Research, vol. 30, p. 100353, 2022.
  • [35] S. Trivedi, N. Patel, and N. Faruqui, “Ndnn based u-net: An innovative 3d brain tumor segmentation method,” in 2022 IEEE 13th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), pp. 0538–0546, IEEE, 2022.
Notes
Record developed with funds from the Ministry of Science and Higher Education (MNiSW), agreement no. POPUL/SP/0154/2024/02, as part of the "Społeczna odpowiedzialność nauki II" (Social Responsibility of Science II) programme, module: Popularisation of Science and Promotion of Sport (2025).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-69501f00-903c-4b88-b6a0-c0b628fae48f