Article title

Rat robot motion state identification based on a wearable inertial sensor

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Rat robots have great potential in rescue and search tasks because of their excellent motion ability. However, most current rat-robot systems rely on human guidance, owing to the variable voluntary motor behaviour of rats, which limits their application. In this study, we developed a real-time system to detect a rat robot's transient motion states as a prerequisite for further study of automatic navigation. We built the detection model using a wearable inertial sensor to capture acceleration and angular-velocity data during the control of a rat robot. Several machine learning algorithms, including decision trees, random forests, logistic regression, and support vector machines, were employed to classify the motion states. The detection system was tested in manual navigation experiments, achieving a detection accuracy of 96.70%. The resulting sequence of transient motion states can also serve as a promising reference for offline behaviour analysis.
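The abstract describes a pipeline of windowed feature extraction from tri-axial inertial data followed by supervised classification. A rough, self-contained sketch of that kind of pipeline (not the authors' code: the window length, the mean/standard-deviation features, and the stand-in nearest-centroid classifier are all illustrative assumptions) might look like this:

```python
# Illustrative sketch of motion-state classification from tri-axial
# accelerometer samples: fixed-length windowing, per-axis statistical
# features, and a minimal nearest-centroid classifier standing in for
# the decision-tree / random-forest / LR / SVM models compared in the paper.
import math
from collections import defaultdict

def window_features(samples, win=20):
    """Split a stream of (ax, ay, az) samples into fixed windows and
    compute per-axis mean and standard deviation for each window."""
    feats = []
    for start in range(0, len(samples) - win + 1, win):
        window = samples[start:start + win]
        row = []
        for axis in range(3):
            vals = [s[axis] for s in window]
            mean = sum(vals) / win
            var = sum((v - mean) ** 2 for v in vals) / win
            row.extend([mean, math.sqrt(var)])
        feats.append(row)
    return feats

class NearestCentroid:
    """Tiny stand-in classifier: label each feature vector with the
    class whose mean training vector is closest in Euclidean distance."""
    def fit(self, X, y):
        sums, counts = defaultdict(lambda: None), defaultdict(int)
        for row, label in zip(X, y):
            if sums[label] is None:
                sums[label] = [0.0] * len(row)
            sums[label] = [a + b for a, b in zip(sums[label], row)]
            counts[label] += 1
        self.centroids = {lbl: [v / counts[lbl] for v in vec]
                          for lbl, vec in sums.items()}
        return self

    def predict(self, X):
        def dist(a, b):
            return sum((p - q) ** 2 for p, q in zip(a, b))
        return [min(self.centroids, key=lambda lbl: dist(row, self.centroids[lbl]))
                for row in X]
```

In practice, a library such as scikit-learn would supply the actual decision-tree, random-forest, logistic-regression, and SVM models the paper compares on these window features.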
Year
Pages
255--268
Physical description
Bibliography: 37 items, photographs, figures, tables, graphs
Authors
author
  • Zhejiang University, State Key Laboratory of Fluid Power and Mechatronic Systems, Hangzhou, China
author
  • Zhejiang University, Qiushi Academy for Advanced Studies (QAAS), Hangzhou, China
  • Zhejiang University, Key Laboratory of Biomedical Engineering of Education Ministry, Hangzhou, China
author
  • Zhejiang University, State Key Laboratory of Fluid Power and Mechatronic Systems, Hangzhou, China
  • Zhejiang University, Ningbo Research Institute, Ningbo, China
author
  • Zhejiang University, State Key Laboratory of Fluid Power and Mechatronic Systems, Hangzhou, China
  • Zhejiang University, Ningbo Research Institute, Ningbo, China
author
  • Zhejiang University, Qiushi Academy for Advanced Studies (QAAS), Hangzhou, China
  • Zhejiang Lab, Hangzhou, China
Bibliography
  • [1] Latif, T., & Bozkurt, A. (2017). Roach Biobots: Toward Reliability and Optimization of Control. IEEE Pulse, 8(5), 27-30. https://doi.org/10.1109/mpul.2017.2729413
  • [2] Talwar, S. K., Xu, S., Hawley, E. S., Weiss, S. A., Moxon, K. A., & Chapin, J. K. (2002). Rat Navigation Guided by Remote Control. Nature, 417(6884), 37-38. https://doi.org/10.1038/417037a
  • [3] Koo, B., Koh, C. S., Park, H., Lee, H., Chang, J. W., Choi, S., & Shin, H. (2017). Manipulation of Rat Movement via Nigrostriatal Stimulation Controlled by Human Visually Evoked Potentials. Scientific Reports, 7(1), 2340-2347. https://doi.org/10.1038/s41598-017-02521-6
  • [4] Cai, L., Dai, Z., Wang, W., Wang, H., & Tang, Y. (2015). Modulating Motor Behaviors by Electrical Stimulation of Specific Nuclei in Pigeons. Journal of Bionic Engineering, 12(4), 555-564. https://doi.org/10.1016/S1672-6529(14)60145-1
  • [5] Zhang, C., Liu, J., Tian, H., Kang, X., Rui, Y., Yang, B., Zhu, H., & Yang, C. (2013). Control of Swimming in Crucian Carp: Stimulation of the Brain Using an Implantable Wire Electrode. The 8th Annual IEEE International Conference on Nano/Micro Engineered and Molecular Systems, China, 360-363. https://doi.org/10.1109/NEMS.2013.6559751
  • [6] Bozkurt, A., Lobaton, E., & Sichitiu, M. (2016). A Biobotic Distributed Sensor Network for Under Rubble Search and Rescue. Computer, 49(5), 38-46. https://doi.org/10.1109/MC.2016.136
  • [7] Feng, Z., Chen, W., Ye, X., Zhang, S., Zheng, X., Wang, P., Jiang, J., Jin, L., Xu, Z., Liu, C., Liu, F., Luo, J., Zhuang, Y., & Zheng, X. (2007). A Remote Control Training System for Rat Navigation in Complicated Environment. Journal of Zhejiang University - SCIENCE A, 8(2), 323-330. https://doi.org/10.1631/jzus.2007.A0323
  • [8] Zhang, X., Sun, C., Zheng, N., Chen, W., & Zheng, X. (2012). Motion States Extraction with Optical Flow for Rat-Robot Automatic Navigation. 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, USA, 976-979. https://doi.org/10.1109/EMBC.2012.6346096
  • [9] Camalan, S., Sengul, G., Misra, S., Maskeliunas, R., & Damasevicius, R. (2018). Gender Detection Using 3D Anthropometric Measurements by Kinect. Metrology and Measurement Systems, 25(2), 253-267. https://doi.org/10.24425/119568
  • [10] Wodziński, M., & Krzyżanowska, A. (2017). Sequential Classification of Palm Gestures Based on A* Algorithm and MLP Neural Network for Quadrocopter Control. Metrology and Measurement Systems, 24(2), 265-276. https://doi.org/10.1515/mms-2017-0021
  • [11] Wang, D., Dong, Y., Li, Q., Wu, J., & Wen, Y. (2018). Estimation of Small Uav Position and Attitude with Reliable In-Flight Initial Alignment for MEMS Inertial Sensors. Metrology and Measurement Systems, 25(3), 603-616. https://doi.org/10.24425/123904
  • [12] den Uijl, I., Gómez Álvarez, C. B., Bartram, D., Dror, Y., Holland, R., & Cook, A. (2017). External Validation of a Collar-Mounted Triaxial Accelerometer for Second-By-Second Monitoring of Eight Behavioural States in Dogs. Plos One, 12(11), e188481. https://doi.org/10.1371/journal.pone.0188481
  • [13] Valentin, G., Alcaidinho, J., Howard, A. M., Jackson, M. M., & Starner, T. E. (2015). Towards a Canine-Human Communication System Based on Head Gestures. The 12th International Conference ACM, USA, 1-6. https://doi.org/10.1145/2832932.2837016
  • [14] Wang, Y., Lu, M., Wu, Z., Tian, L., Xu, K., Zheng, X., & Pan, G. (2015). Visual Cue-Guided Rat Cyborg for Automatic Navigation. IEEE Computational Intelligence Magazine, 10(2), 42-52. https://doi.org/10.1109/MCI.2015.2405318
  • [15] Gjoreski, H., Lustrek, M., & Gams, M. (2011). Accelerometer Placement for Posture Recognition and Fall Detection. 2011 Seventh International Conference on Intelligent Environments, UK, 47-54. https://doi.org/10.1109/IE.2011.11
  • [16] Patel, M., Pavic, A., & Goodwin, V. A. (2020). Wearable Inertial Sensors to Measure Gait and Posture Characteristic Differences in Older Adult Fallers and Non-Fallers: A Scoping Review. Gait & Posture, 76, 110-121. https://doi.org/10.1016/j.gaitpost.2019.10.039
  • [17] Nathan, R., Spiegel, O., Fortmann-Roe, S., Harel, R., Wikelski, M., & Getz, W. M. (2012). Using Tri-Axial Acceleration Data to Identify Behavioral Modes of Free-Ranging Animals: General Concepts and Tools Illustrated for Griffon Vultures. Journal of Experimental Biology, 215(6), 986-996. https://doi.org/10.1242/jeb.058602
  • [18] Graf, P. M., Wilson, R. P., Qasem, L., Hackländer, K., & Rosell, F. (2015). The Use of Acceleration to Code for Animal Behaviours; A Case Study in Free-Ranging Eurasian Beavers Castor fiber. Plos One, 10(8), e136751. https://doi.org/10.1371/journal.pone.0136751
  • [19] Majikes, J., Brugarolas, R., Winters, M., Yuschak, S., Mealin, S., Walker, K., Yang, P., Sherman, B., Bozkurt, A., & Roberts, D. L. (2017). Balancing Noise Sensitivity, Response Latency, and Posture Accuracy for a Computer-Assisted Canine Posture Training System. International Journal of Human-Computer Studies, 98, 179-195. https://doi.org/10.1016/j.ijhcs.2016.04.010
  • [20] Dutta, A. (2019). Cyborgs: Neuromuscular Control of Insects. 2019 9th International IEEE/EMBS Conference on Neural Engineering (NER), USA, 682-685. https://doi.org/10.1109/NER.2019.8717133
  • [21] Cole, J., Mohammadzadeh, F., Bollinger, C., Latif, T., Bozkurt, A., & Lobaton, E. (2017). A Study on Motion Mode Identification for Cyborg Roaches. ICASSP 2017, USA, 2652-2656. https://doi.org/10.1109/ICASSP.2017.7952637
  • [22] Yu, Y., Pan, G., Gong, Y., Xu, K., Zheng, N., Hua, W., Zheng, X., & Wu, Z. (2016). Intelligence-Augmented Rat Cyborgs in Maze Solving. Plos One, 11(2), e147754. https://doi.org/10.1371/journal.pone.0147754
  • [23] Carroll, G., Slip, D., Jonsen, I., & Harcourt, R. (2014). Supervised Accelerometry Analysis Can Identify Prey Capture by Penguins at Sea. Journal of Experimental Biology, 217(24), 4295-4302. https://doi.org/10.1242/jeb.113076
  • [24] Watanabe, S., Izawa, M., Kato, A., Ropert-Coudert, Y., & Naito, Y. (2005). A New Technique for Monitoring the Detailed Behaviour of Terrestrial Animals: A Case Study with the Domestic Cat. Applied Animal Behaviour Science, 94(1-2), 117-131. https://doi.org/10.1016/j.applanim.2005.01.010
  • [25] Martiskainen, P., Järvinen, M., Skön, J., Tiirikainen, J., Kolehmainen, M., & Mononen, J. (2009). Cow Behaviour Pattern Recognition Using a Three-Dimensional Accelerometer and Support Vector Machines. Applied Animal Behaviour Science, 119(1-2), 32-38. https://doi.org/10.1016/j.applanim.2009.03.005
  • [26] Capela, N. A., Lemaire, E. D., & Baddour, N. (2015). Feature Selection for Wearable Smartphone-Based Human Activity Recognition with Able-Bodied, Elderly, and Stroke Patients. Plos One, 10(4), e124414. https://doi.org/10.1371/journal.pone.0124414
  • [27] Ladds, M. A., Thompson, A. P., Slip, D. J., Hocking, D. P., & Harcourt, R. G. (2016). Seeing It All: Evaluating Supervised Machine Learning Methods for the Classification of Diverse Otariid Behaviours. Plos One, 11(12), e166898. https://doi.org/10.1371/journal.pone.0166898
  • [28] Breiman, L., Friedman, J. H., Olshen, R. A., & Stone, C. J. (1984). Classification and Regression Trees. Wadsworth and Brooks.
  • [29] Breiman, L. (2001). Random Forests. Machine Learning, 45(1), 5-32. https://doi.org/10.1023/A:1010933404324
  • [30] Hosmer, D. W., & Lemeshow, S. (2000). Applied Logistic Regression. Wiley-Interscience. https://doi.org/10.1002/0471722146
  • [31] Cortes, C., & Vapnik, V. (1995). Support-Vector Networks. Machine Learning, 20(3), 273-297. https://doi.org/10.1007/BF00994018
  • [32] Wang, Y., Nickel, B., Rutishauser, M., Bryce, C. M., Williams, T. M., Elkaim, G., & Wilmers, C. C. (2015). Movement, Resting, and Attack Behaviors of Wild Pumas are Revealed by Tri-Axial Accelerometer Measurements. Movement Ecology, 3(1), 2. https://doi.org/10.1186/s40462-015-0030-0
  • [33] Peng, Y., Kondo, N., Fujiura, T., Suzuki, T., Wulandari, Yoshioka, H., & Itoyama, E. (2019). Classification of Multiple Cattle Behavior Patterns Using a Recurrent Neural Network with Long Short-Term Memory and Inertial Measurement Units. Computers and Electronics in Agriculture, 157, 247-253. https://doi.org/10.1016/j.compag.2018.12.023
  • [34] McClune, D. W., Marks, N. J., Wilson, R. P., Houghton, J. D., Montgomery, I. W., McGowan, N. E., Gormley, E., & Scantlebury, M. (2014). Tri-Axial Accelerometers Quantify Behaviour in the Eurasian Badger (Meles meles): Towards an Automated Interpretation of Field Data. Animal Biotelemetry, 2(1), 5. https://doi.org/10.1186/2050-3385-2-5
  • [35] Kuppers, F., Albers, J., & Haselhoff, A. (2019). Random Forest on an Embedded Device for Real-Time Machine State Classification. 2019 27th European Signal Processing Conference (EUSIPCO), Spain. https://doi.org/10.23919/EUSIPCO.2019.8902993
  • [36] Chereshnev, R., & Attila, K. (2018). RapidHARe: A Computationally Inexpensive Method for Real-Time Human Activity Recognition from Wearable Sensors. Journal of Ambient Intelligence and Smart Environments, 10(5), 377-391.
  • [37] Sun, C., Zhang, X., Zheng, N., Chen, W., & Zheng, X. (2012). Bio-Robots Automatic Navigation with Electrical Reward Stimulation. 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, USA, 348-351. https://doi.org/10.1109/EMBC.2012.6345940
Notes
Record created with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the programme "Social Responsibility of Science", module: Popularisation of Science and Promotion of Sport (2021).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-609435b5-b8e4-41ba-9640-0f43b2577dab