Article title

A lane tracking algorithm for low-computational-power microcontroller-controlled autonomous vehicle models

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
In this work, three tasks are presented: road-lane detection and trajectory estimation, environment mapping, and the application of a neural network. All of these tasks build on the results of the lane detection method. The presented method is distinguished by applying an interpolating transformation to all previously detected edge points: this transformation transfers the points into a “bird’s-eye” coordinate system and distributes them on a regular grid. Road lanes are then identified by a lane-feature filter that analyses the distances between unique points, which yields lane views in a coordinate system that preserves the distance condition. The road-environment map is constructed from the resulting images using a probabilistic algorithm, Distributed Particle SLAM (DP-SLAM). Based on the map, a method has been developed for extracting characteristic points that describe the path of the road lanes in each incoming camera image; these points are then used to train the neural network. The network solves a regression task for the coordinates of the lane points, enabling the coefficients of a parabolic fit to be identified. Validation has been performed.
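The bird’s-eye transformation and grid step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors’ implementation: it assumes the camera-to-ground mapping is available as a 3×3 homography matrix `H` (obtained elsewhere, e.g. from calibration), and the grid cell size `cell` is an arbitrary choice for the example.

```python
import numpy as np

def to_birds_eye(points, H):
    """Map Nx2 image-plane edge points through a 3x3 homography H
    into ground-plane ("bird's-eye") coordinates."""
    pts = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coords
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]  # perspective divide

def bin_to_grid(points, cell=0.1):
    """Distribute transformed points onto a regular grid of the given
    cell size, keeping one representative point per occupied cell."""
    keys = np.round(points / cell).astype(int)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]
```

With the identity homography the mapping is a no-op, which makes the perspective divide easy to sanity-check; in practice `H` would come from a calibration routine such as OpenCV’s `cv2.getPerspectiveTransform`.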
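The final step the abstract mentions, identifying parabolic-fit coefficients from the regressed lane-point coordinates, can be illustrated with an ordinary least-squares fit. This is a hedged sketch of the general technique, not the paper’s specific procedure; the function name and the use of `np.polyfit` are this example’s choices.

```python
import numpy as np

def lane_parabola(xs, ys):
    """Fit y = a*x^2 + b*x + c through lane points by least squares,
    returning the coefficients [a, b, c]."""
    return np.polyfit(xs, ys, deg=2)
```

For points lying exactly on a parabola the least-squares solution recovers the coefficients exactly; with noisy network predictions it returns the best-fitting parabola in the squared-error sense.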
Journal
Year
Pages
43-56
Physical description
Bibliography: 29 items
Authors
  • Warsaw University of Technology, Faculty of Transport, Koszykowa 75, Warsaw, 00-662, Poland
  • Warsaw University of Technology, Faculty of Transport, Koszykowa 75, Warsaw, 00-662, Poland
author
  • Warsaw University of Technology, Faculty of Transport, Koszykowa 75, Warsaw, 00-662, Poland
Bibliography
  • 1. Choromański, W. & Grabarek, I. & Kozłowski, M. & Czerepicki, A. & Marczuk, K. Pojazdy autonomiczne i systemy transportu autonomicznego. Wydawnictwo Naukowe PWN. 2020. [In Polish: Autonomous vehicles and autonomous transport systems. PWN Scientific Publishing House].
  • 2. SAE Levels of Driving Automation™ Refined for Clarity and International Audience. Available at: https://www.sae.org/blog/sae-j3016-update.
  • 3. Dey, K. & Rayamajhi, A. & Chowdhury, M. & Bhavsar, P. & Martin, J. Vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication in a heterogeneous wireless network - Performance evaluation. Transportation Research Part C: Emerging Technologies. 2016. Vol. 68. P. 168-184. DOI: 10.1016/j.trc.2016.03.008.
  • 4. Apostoloff, N. & Zelinsky A. Robust vision based lane tracking using multiple cues and particle filtering. IEEE IV2003 Intelligent Vehicles Symposium. Proceedings (Cat. No.03TH8683). Columbus, OH, USA. 2003. P. 558-563. DOI: 10.1109/IVS.2003.1212973.
  • 5. Wang, Y. & Teoh, E. & Shen, D. Lane detection and tracking using B-Snake. Image and Vision Computing. 2004. Vol. 22. P. 269-280. DOI: 10.1016/j.imavis.2003.10.003.
  • 6. Forogh, P. Line Detection in Python OpenCV with HoughLines. Available at: https://www.youtube.com/watch?v=OchCsSiffeE.
  • 7. Aly, M. Real time detection of lane markers in urban streets. 2008 IEEE Intelligent Vehicles Symposium. Eindhoven, Netherlands. 2008. P. 7-12. DOI: 10.1109/IVS.2008.4621152.
  • 8. Ding, L. & Zhang, H. & Xiao, J. & Shu, C., & Lu, S. A lane detection method based on semantic segmentation. Comput. Model. Eng. Sci. 2020. Vol. 122(3). P. 1039-1053.
  • 9. Ko, Y. & Lee, Y. & Azam, S. & Munir, F. & Jeon, M. & Pedrycz, W. Key Points Estimation and Point Instance Segmentation Approach for Lane Detection. In: IEEE Transactions on Intelligent Transportation Systems. 2022. Vol. 23. No. 7. P. 8949-8958. DOI: 10.1109/TITS.2021.3088488.
  • 10. Neven, D. & De Brabandere, B. & Georgoulis, S. & Proesmans, M. & Van Gool, L. Towards end- to-end lane detection: an instance segmentation approach. In: 2018 IEEE intelligent vehicles symposium. 2018. P. 286-291.
  • 11. Getahun, T. Lane Detection for Autonomous Driving: Conventional and CNN approaches. Access Laboratory. Available at: https://www.youtube.com/watch?v=xIRT3rgWrFQ.
  • 12. ChanHee, J. Curved Lane Detection_Algorithm Explanation in English. Available at: https://www.youtube.com/watch?v=0RAijzUnQAU.
  • 13. Weng, J. & Cohen, P. & Herniou, M. Camera calibration with distortion models and accuracy evaluation. IEEE Transactions on Pattern Analysis and Machine Intelligence. 1992. Vol. 14(10). P. 965-980.
  • 14. Dulari, B. A comprehensive guide for Camera calibration in computer vision. Available at: https://www.analyticsvidhya.com/blog/2021/10/a-comprehensive-guide-for-camera-calibration-in-computer-vision/.
  • 15. Kutila, M. & Korpinen, J. & Viitanen, J. Camera calibration in machine automation. Human Friendly Mechatronics. Elsevier Science. 2001. P. 211-216. ISBN: 9780444506498. DOI: 10.1016/B978-044450649-8/50036-X.
  • 16. Jiang, K. Calibrate fisheye lens using OpenCV - part 1, part 2. 2017. Available at: https://medium.com/@kennethjiang/calibrate-fisheye-lens-using-opencv-333b05afa0b0. https://medium.com/@kennethjiang/calibrate-fisheye-lens-using-opencv-part-2-13990f1b157f.
  • 17. OpenCV documentation for camera calibration. Available at: https://docs.opencv.org/4.x/dc/dbb/tutorial_py_calibration.html.
  • 18. Kausthub Sadekar. Understanding Lens Distortion. Available at: https://learnopencv.com/understanding-lens-distortion/.
  • 19. MathWorks. What Is Camera Calibration? Available at: https://in.mathworks.com/help/vision/ug/camera-calibration.html.
  • 20. Lenton, D. Part I & II: Projective Geometry in 2D. Available at: https://medium.com/@unifyai/part-i-projective-geometry-in-2d-b1ca26d5fa2a. https://medium.com/@unifyai/part-ii-projective-transformations-in-2d-2e99ac9c7e9f.
  • 21. Stone, J.V. Bayes’ Rule: A Tutorial Introduction to Bayesian Analysis. Available at: http://jimstone.staff.shef.ac.uk/.
  • 22. Mogensen, L. & Andersen, N.A. & Ravn, O. & Poulsen, N. Using Kalmtool in Navigation of Mobile Robots. 2023. Available at: https://www.researchgate.net/publication/269262121_Kalmtool_Used_for_Mobile_Robot_Navigation.
  • 23. Eliazar, A. & Parr, R. DP-SLAM: Fast, Robust Simultaneous Localization and Mapping Without Predetermined Landmarks. Available at: http://people.ee.duke.edu/~lcarin/Lihan4.21.06a.pdf.
  • 24. Biber, P. & Strasser, W. The normal distributions transform: A new approach to laser scan matching. In: Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2003. P. 2743-2748.
  • 25. Tensorflow.org. Transfer learning and fine-tuning. TensorFlow Core. Available at: https://www.tensorflow.org/tutorials/images/transfer_learning.
  • 26. Baheti, P. A Newbie-Friendly Guide to Transfer Learning. Available at: https://www.v7labs.com/blog/transfer-learning-guide.
  • 27. MathWorks. Transfer Learning for Training Deep Learning Models. Available at: https://www.mathworks.com/discovery/transfer-learning.html.
  • 28. Nehemiah, A. Deep Learning for Automated Driving (Part 2) - Lane Detection. Available at: https://blogs.mathworks.com/deep-learning/2017/11/17/deep-learning-for-automated-driving-part-2-lane-detection/.
  • 29. MathWorks. Automated Driving Toolbox, Design, simulate, and test ADAS and autonomous driving systems. Available at: https://www.mathworks.com/products/automated-driving.html.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-eb52a587-466f-4e07-b571-7e56a2d0813e