Article title
Content
Full texts:
Identifiers
Title variants
Publication languages
Abstracts
Sensor-based Human Activity Recognition (SHAR) technology uses sensor signals from smart devices to detect and identify human activities, thereby assisting in daily life. Following the successful application of deep learning in other domains, researchers are exploring its integration with SHAR. However, the traditional fixed-length sliding windows used to segment sensor datasets often mix samples from several activity classes within a single window. To alleviate this issue, researchers have introduced temporal attention mechanisms that focus on the key time points of an activity; attending to individual time points, however, fits the segment-like structure of sensor data poorly and is computationally expensive. To address this challenge, we propose an innovative Multi-scale Time Segments Attention Mechanism (MTSA), which diverges from traditional temporal attention by attending to the time segments relevant to an activity, better matching the characteristics of SHAR data and significantly reducing computational resource consumption. Experiments on the widely used UCI-HAR, PAMAP2, and WISDM datasets validate the effectiveness of MTSA and demonstrate that it can be integrated seamlessly into existing SHAR models, improving performance without adding extra computational overhead.
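For readers unfamiliar with the preprocessing step the abstract criticizes, the sketch below illustrates fixed sliding-window segmentation in Python; the window length, step size, and majority-vote labelling are illustrative assumptions, not the paper's settings. Any window that straddles an activity transition contains samples from more than one class, which is the multi-class mixing problem MTSA is designed to mitigate.

```python
import numpy as np

def sliding_windows(signal, labels, window=128, step=64):
    """Segment a (T, C) sensor stream into fixed-size windows.

    A window that spans an activity transition contains samples from
    several classes; the common workaround is a majority-vote label,
    which silently discards the minority activity inside the window.
    """
    xs, ys, mixed = [], [], 0
    for start in range(0, len(signal) - window + 1, step):
        seg = signal[start:start + window]
        classes, counts = np.unique(labels[start:start + window],
                                    return_counts=True)
        if len(classes) > 1:                   # window straddles a transition
            mixed += 1
        xs.append(seg)
        ys.append(classes[np.argmax(counts)])  # majority-vote label
    return np.stack(xs), np.array(ys), mixed

# toy stream: 300 samples of activity 0 followed by 300 of activity 1
sig = np.random.randn(600, 3)                  # e.g. a 3-axis accelerometer
lab = np.array([0] * 300 + [1] * 300)
X, y, n_mixed = sliding_windows(sig, lab)
print(X.shape, y.shape, f"{n_mixed} mixed windows")  # 2 windows mix classes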
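The paper's exact MTSA architecture is not given in this record, so the following PyTorch module is only a hedged sketch of the segment-level idea the abstract describes: features are pooled into coarse time segments at several scales, each segment is scored once, and the broadcast scores gate the original features, so attention costs O(T/s) scores per scale instead of O(T). Every name and design choice here (the TimeSegmentAttention class, the scales, the sigmoid gate) is a hypothetical illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TimeSegmentAttention(nn.Module):
    """Hypothetical multi-scale time-segment attention (sketch only).

    Instead of scoring every time step (point-wise temporal attention),
    the feature map is pooled into coarse segments at several scales,
    each segment is scored once, and the scores are broadcast back to
    the original resolution to gate the features.
    """

    def __init__(self, channels, scales=(2, 4, 8)):
        super().__init__()
        self.scales = scales
        # one lightweight 1x1-conv scorer per scale
        self.scorers = nn.ModuleList(
            nn.Conv1d(channels, 1, kernel_size=1) for _ in scales
        )

    def forward(self, x):                        # x: (batch, channels, time)
        t = x.size(-1)
        weight = 0.0
        for scale, scorer in zip(self.scales, self.scorers):
            seg = F.adaptive_avg_pool1d(x, max(t // scale, 1))  # pool to segments
            score = scorer(seg)                                 # (B, 1, T // scale)
            weight = weight + F.interpolate(score, size=t, mode="nearest")
        weight = torch.sigmoid(weight / len(self.scales))       # (B, 1, T) gate
        return x * weight            # reweight activity-relevant segments

# usage: a drop-in gate after any 1-D convolutional feature extractor
feats = torch.randn(16, 64, 128)   # (batch, channels, time steps)
attn = TimeSegmentAttention(channels=64)
print(attn(feats).shape)           # torch.Size([16, 64, 128])
```

Because the module only rescales an existing feature map, it can be inserted into a SHAR backbone without changing its output shape, which is consistent with the abstract's claim that MTSA integrates into existing models without extra overhead.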
Journal
Year
Volume
Pages
1-17
Physical description
Bibliography: 32 items, figures, tables, charts, formulas
Authors
author
- Changzhou University, Changzhou 213164, China
author
- Changzhou University, Changzhou 213164, China
author
- Changzhou University, Changzhou 213164, China
author
- Changzhou University, Changzhou 213164, China
author
- Changzhou University, Changzhou 213164, China
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-b5b888cd-5c23-4e23-8742-7927fc7d2fa3