

Article title

Indoor positioning methods for small autonomous vehicles

Publication languages
EN
Abstracts
EN
Accurate position estimation is crucial for unmanned vehicles to carry out reconnaissance, procurement and rescue missions in autonomous mode. Determining goals and mission checkpoints allows the platform operator, as well as the decision-making algorithms, to take appropriate actions. The paper presents popular positioning methods for spaces where a GNSS navigation signal is not available. The potential of technologies such as UWB, ultrasound (US), INS, or SLAM algorithms based on readings from LiDARs or cameras is discussed in the context of small land platforms that can perform specific tasks in an automated manner. The paper also discusses solutions, described in detail in the cited literature, for fusing data from various types of sensors, which ensures greater accuracy and reliability of the obtained readings.
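The sensor fusion the abstract refers to is commonly realized with Kalman filtering (cf. refs. [31]-[33], [51]-[53]). As a rough one-dimensional illustration only, the sketch below blends an inertial dead-reckoning prediction with a noisy UWB-derived position through a scalar Kalman filter; all numbers (noise variances, sampling period, sensor model) are invented for the example and are not taken from the paper.

```python
import random

def kalman_fuse(x, p, v, z, dt=0.1, q=0.05, r=0.25):
    """One predict/update cycle of a scalar Kalman filter.
    x, p : current position estimate and its variance
    v    : velocity from inertial dead-reckoning (prediction input)
    z    : position derived from a UWB range measurement
    q, r : assumed process and measurement noise variances
    """
    # Predict: integrate velocity; uncertainty grows by process noise.
    x_pred = x + v * dt
    p_pred = p + q
    # Update: the Kalman gain weights measurement against prediction.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Toy scenario: a stationary platform (true position 0.0 m) observed
# through UWB readings with 0.5 m standard deviation of noise.
random.seed(0)
x, p = 0.0, 1.0          # start with a very uncertain estimate
for _ in range(50):
    z = random.gauss(0.0, 0.5)
    x, p = kalman_fuse(x, p, v=0.0, z=z)
```

After the loop, the estimate variance `p` has settled well below the raw measurement variance `r`, which is the sense in which fusing an inertial prediction with radio measurements yields readings more reliable than either source alone.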
Year
Volume
Pages
93-107
Physical description
Bibliography: 53 items, figures
Authors
  • Ośrodek Badawczo-Rozwojowy Urządzeń Mechanicznych "OBRUM" sp z o.o., Gliwice
  • Siles Labs Sp. z o.o., Gliwice
Bibliography
  • [1] Fernandez-Madrigal J. A. et al.: “Application of UWB and GPS technologies for vehicle localization in combined indoor-outdoor environments”, 2007 9th International Symposium on Signal Processing and Its Applications, 2007.
  • [2] Henderson H. P., Bevly D. M.: “Relative position of UGVs in constrained environments using low cost IMU and GPS augmented with ultrasonic sensors”, 2008 IEEE/ION Position, Location and Navigation Symposium, pp. 1269-1277, 2008.
  • [3] Kołakowski J., Cichocki J., Makal P., Michnowski R.: “An Ultra-Wideband System for Vehicle Positioning”, International Journal of Electronics and Telecommunications, vol. 56, no. 3, pp. 247-256, 2010.
  • [4] Memon S., Memon M. M., Shaikh F. K., Laghari S.: “Smart indoor positioning using BLE technology”, 4th IEEE International Conference on Engineering Technologies and Applied Sciences (ICETAS), 2017.
  • [5] Lu S., Xu C., Zhong R. Y., Wang L.: “A RFID-enabled positioning system in automated guided vehicle for smart factories”, Journal of Manufacturing Systems, vol. 44, no. 1, pp. 179-190, July 2017.
  • [6] Cadena C.: “Past, Present, and Future of Simultaneous Localization And Mapping: Towards the Robust-Perception Age”, IEEE Transactions on Robotics, vol. 32, no. 6, pp. 1309-1332, 2016.
  • [7] Gorostiza E. M. et al.: “Infrared Sensor System for Mobile-Robot Positioning in Intelligent Spaces”, Sensors, vol. 11, no. 5, pp. 5416-5438, May 2011.
  • [8] Nawrat A., Jędrasiak K., Daniec K., Koteras R.: “Inertial Navigation Systems and Its Practical Applications”, New Approach of Indoor and Outdoor Localization Systems, pp. 213-240, 2011.
  • [9] Markowska-Prorok A. S.: “Models and Algorithms for Ultra-Wideband Localization in Single- and Multi-Robot Systems”, Thèse no 5746, Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland, 2013.
  • [10] Carter J. et al.: “Lidar 101: An Introduction to Lidar Technology, Data, and Applications”, National Oceanic and Atmospheric Administration, USA, November 2012.
  • [11] Durrant-Whyte H., Bailey T.: “Simultaneous localization and mapping: part I”, IEEE Robotics & Automation Magazine, vol. 13, no. 2, pp. 99-110, June 2006.
  • [12] Smith R. C., Cheeseman P.: “On the Representation and Estimation of Spatial Uncertainty”, The International Journal of Robotics Research, vol. 4, no. 4, pp. 56-68, 1986.
  • [13] Wu Y., Tang F., Li H.: “Image Based Camera Localization: an Overview”, Visual Computing for Industry, Biomedicine and Art, 2018.
  • [14] Trajkovic M., Hedley M.: “Fast Corner Detection”, Image and Vision Computing, vol. 16, no. 2, pp. 75-87, February 1998.
  • [15] Bay H., Tuytelaars T., Van Gool L.: “SURF: Speeded Up Robust Features”, Computer Vision and Image Understanding, vol. 110, no. 3, pp. 346-359, June 2008.
  • [16] Lowe D. G.: “Distinctive Image Features from Scale-Invariant Keypoints”, International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, November 2004.
  • [17] Riisgaard S., Blas M. R.: “SLAM for Dummies: A Tutorial Approach to Simultaneous Localization and Mapping”, MIT, USA, 2004.
  • [18] Thrun S., Burgard W., Fox D.: “Probabilistic robotics”, MIT Press, USA, August 2005.
  • [19] Montemerlo M., Thrun S., Koller D., Wegbreit B.: “FastSLAM: A Factored Solution to the Simultaneous Localization and Mapping Problem”, AAAI National Conference on Artificial Intelligence, pp. 593-598, 2002.
  • [20] Mur-Artal R., Montiel J. M. M., Tardos J. D.: “ORB-SLAM: a Versatile and Accurate Monocular SLAM System”, IEEE Transactions on Robotics, vol. 31, no. 5, pp. 1147-1163, October 2015.
  • [21] Kriegman D. J., Triendl E., Binford T. O.: “Stereo vision and navigation in buildings for mobile robots”, IEEE Transactions on Robotics and Automation, vol. 5, no. 6, December 1989.
  • [22] Engel J., Stuckler J., Cremers D.: “Large-scale direct SLAM with stereo cameras”, IEEE/RSJ International Conference on Intelligent Robots and Systems, Germany 2015.
  • [23] Besl P. J., McKay N. D.: "Method for registration of 3-D shapes", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 2, February 1992.
  • [24] Hess W., Kohler D., Rapp H. and Andor D.: "Real-time loop closure in 2D LIDAR SLAM," IEEE International Conference on Robotics and Automation, pp. 1271-1278, Stockholm 2016.
  • [25] Waymo Safety Report, “On the Road to Fully Self-Driving”, Waymo, USA, 2018 [online], [accessed: 10.01.2018], available online: https://waymo.com/safety/.
  • [26] Barsan I. A., Wang S., Pokrovsky A., Urtasun R.: “Learning to Localize Using a LiDAR Intensity Map”, Conference on Robot Learning, Switzerland, 2018.
  • [27] Caselitz T., Steder B., Ruhnke M., Burgard W.: “Monocular Camera Localization in 3D LiDAR Maps”, IEEE/RSJ International Conference on Intelligent Robots and Systems, 2016.
  • [28] Gutmann J. S., Schlegel C.: “AMOS: comparison of scan matching approaches for self-localization in indoor environments”, Euromicro Workshop on Advanced Mobile Robots, Germany, 1996.
  • [29] Harle R.: “A Survey of Indoor Inertial Positioning Systems for Pedestrians”, IEEE Communications Surveys & Tutorials, vol. 15, no. 3, pp. 1281-1293, 2013.
  • [30] Woodman O. J.: “An introduction to inertial navigation”, University of Cambridge, Computer Laboratory, Technical Report, no. 696, August 2007.
  • [31] Halvorsen K., Soderstrom T., Stokes V., Lanshammar H.: “Using an Extended Kalman Filter for Rigid Body Pose Estimation”, Journal of Biomechanical Engineering, vol. 127, pp. 475-483, June 2005.
  • [32] Marton L., Gyorgy K.: “Two-Stage Kalman Filtering for Indoor Localization of Omnidirectional Robots”, Electrical and Mechanical Engineering, vol. 5, pp. 44-60, 2013.
  • [33] Sabatini A. M.: “Quaternion-Based Extended Kalman Filter for Determining Orientation by Inertial and Magnetic Sensing”, IEEE Transactions On Biomedical Engineering, vol. 53, no. 7, pp. 1346-1356, July 2006.
  • [34] Madgwick S. O. H.: “An efficient orientation filter for inertial and inertial/magnetic sensor arrays”, April 2010.
  • [35] Sewio, “See the value of using indoor tracking in Volkswagen”, 2017 [online], [accessed: 7.01.2019], available online: https://www.sewio.net/customer-projects/volkswagen/.
  • [36] Sewio, “Indoor Tracking of Event Visitors at IBM Summit”, 2017 [online], [accessed: 7.01.2019], available online: https://www.sewio.net/customer-projects/ibm.
  • [37] Sonitor, “Sanford Medical Center Fargo deploys Sonitor's Sense ultrasound-based RTLS platform”, 2017 [online], [accessed: 7.01.2019], available online: https://www.sonitor.com/articles/2018/5/6/sanford-medical-center-fargo-deploys-sonitors-sense-ultrasound-based-rtls-platform.
  • [38] Microsoft, “Microsoft Indoor Localization Competition – IPSN 2018”, 2018 [online], [accessed: 7.01.2019], available online: https://www.microsoft.com/en-us/research/event/microsoft-indoor-localization-competition-ipsn-2018/.
  • [39] Oppermann I., Hamalainen M., Iinatti J.: “UWB: Theory and Applications”, John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, 2004.
  • [40] Chang S., Wolf M., Burdick J. W.: “Human detection and tracking via Ultra-Wideband (UWB) radar”, IEEE International Conference on Robotics and Automation, May 2010.
  • [41] Chen Z., Liu Y., Li S., Wang G.: “Study on the multipath propagation characteristics of UWB signal for indoor lab environments”, IEEE International Conference on Ubiquitous Wireless Broadband (ICUWB), December 2016.
  • [42] Gentile C. et al.: “Multipath and NLOS Mitigation Algorithms”, Geolocation Techniques, Springer, New York, 2013.
  • [43] Song Z., Jiang G., Huang C.: “A Survey on Indoor Positioning Technologies”, Theoretical and Mathematical Foundations of Computer Science, Springer, pp. 198-206, Heidelberg, Germany, 2011.
  • [44] Ijaz F., Yang H. K., Ahmad A. W., Lee Ch.: “Indoor Positioning: A Review of Indoor Ultrasonic Positioning systems”, 15th International Conference on Advanced Communications Technology (ICACT), March 2013.
  • [45] Carter D. J., Silva B. J., Qureshi U. M., Hancke G. P.: “An Ultrasonic Indoor Positioning System for Harsh Environments”, 44th Annual Conference of the IEEE Industrial Electronics Society, pp. 5215-5220, 2018.
  • [46] McCarthy M. R., Muller H. L.: “RF Free Ultrasonic Positioning”, Seventh IEEE International Symposium on Wearable Computers, pp. 79-85, 2003.
  • [47] Holm S.: “Ultrasound positioning based on time-of-flight and signal strength”, International Conference on Indoor Positioning and Indoor Navigation, November 2012.
  • [48] Kelly J., Sukhatme G. S.: “Visual-Inertial Sensor Fusion: Localization, Mapping and Sensor-to-Sensor Self-calibration”, The International Journal of Robotics Research, vol. 30, no. 1, pp. 56-70, January 2011.
  • [49] Kumar G. A. et al.: “A LiDAR and IMU Integrated Indoor Navigation System for UAVs and Its Application in Real-Time Pipeline Classification”, Sensors, vol. 17, no. 6, June 2017.
  • [50] Shin Y. S., Park Y. S., Kim A.: “Direct Visual SLAM using Sparse Depth for Camera-LiDAR System”, IEEE International Conference on Robotics and Automation, May 2018.
  • [51] Benzerrouk H., Nebylov A. V.: “Robust IMU/UWB integration for indoor pedestrian navigation”, 25th Saint Petersburg International Conference on Integrated Navigation Systems (ICINS), 2018.
  • [52] Yao L., Wu Y. W. A., Yao L., Liao Z. Z.: “An integrated IMU and UWB sensor based indoor positioning system”, International Conference on Indoor Positioning and Indoor Navigation (IPIN), 2017.
  • [53] Miraglia G., Maleki K. N., Hook L. R.: “Comparison of two sensor data fusion methods in a tightly coupled UWB/IMU 3-D localization system”, International Conference on Engineering, Technology and Innovation, pp. 611-618, 2017.
Notes
Record created under agreement 509/P-DUN/2018, funded by the Ministry of Science and Higher Education (MNiSW) resources allocated to science-promoting activities (2019).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-9d2a9b01-9659-49e2-8bff-898b4c2c1123