Research Article | Peer-Reviewed

Indoor Positioning of AGVs Based on Multi-Sensor Data Fusion Such as LiDAR

Received: 19 February 2024    Accepted: 29 February 2024    Published: 20 March 2024
Abstract

In recent years, with the rapid growth of industrial-robot technology and demand, Automated Guided Vehicles (AGVs) have been widely deployed in industrial workshops and smart logistics and have become a global research focus. Because their working environments are variable and complex, positioning technology is of paramount importance for AGV robots. To address the main challenges of AGV positioning, namely the large accumulated errors of the wheel odometer and the Inertial Measurement Unit (IMU), the susceptibility of Ultra-Wideband (UWB) positioning accuracy to Non-Line-of-Sight (NLOS) errors, and the distortion points and drift in the point clouds collected by LiDAR while the robot is moving, a novel positioning method is proposed. First, a Weighted Extended Kalman Filter (W-EKF) loosely couples the wheel-odometer and UWB data, and the filter output is converted into W-EKF pose factors. These factors are then added, where appropriate, to the tight coupling of pre-integrated IMU measurements with 3D LiDAR to counteract the distortion points, drift, and accumulated errors produced by the LiDAR, thereby improving positioning accuracy. In experiments, the algorithm achieved a final positioning error of only 6.9 cm, an improvement in positioning accuracy of approximately 80% compared with the loosely coupled integration of the two sensors.
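The abstract describes a two-stage pipeline: a Weighted EKF first fuses wheel odometry with UWB in a loosely coupled filter, and the resulting poses are then injected as unary pose factors into a tightly coupled LiDAR/IMU factor graph. The sketch below is a minimal, hypothetical illustration of the first stage only, assuming a planar [x, y, θ] state, a velocity motion model for the wheel odometry, and a simple NLOS-based inflation of the UWB measurement covariance; the class name, the `nlos_weight` heuristic, and the `pose_factor` export are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

class WeightedEKF:
    """Minimal 2D W-EKF sketch: wheel-odometry prediction, UWB position update.

    The NLOS-based weighting of the UWB measurement covariance is an
    illustrative assumption; the paper's exact weighting scheme is not
    stated in the abstract.
    """

    def __init__(self, x0, P0, Q, R_base):
        self.x = np.asarray(x0, dtype=float)            # state: [x, y, theta]
        self.P = np.asarray(P0, dtype=float)            # state covariance
        self.Q = np.asarray(Q, dtype=float)             # odometry process noise
        self.R_base = np.asarray(R_base, dtype=float)   # nominal UWB noise

    def predict(self, v, omega, dt):
        """Propagate the pose with wheel-odometry velocity measurements."""
        x, y, th = self.x
        self.x = np.array([x + v * np.cos(th) * dt,
                           y + v * np.sin(th) * dt,
                           th + omega * dt])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                      [0.0, 1.0,  v * np.cos(th) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update_uwb(self, z_xy, nlos_weight):
        """Fuse a UWB position fix; inflate R when NLOS is suspected.

        nlos_weight >= 1.0, e.g. derived from residual gating or anchor
        geometry (an assumed heuristic, not from the paper).
        """
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
        R = self.R_base * nlos_weight
        y_res = np.asarray(z_xy, dtype=float) - H @ self.x
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y_res
        self.P = (np.eye(3) - K @ H) @ self.P

    def pose_factor(self):
        """Export the fused pose and covariance as a (mean, covariance) pair
        that a LiDAR/IMU factor graph could consume as a unary prior."""
        return self.x.copy(), self.P.copy()


if __name__ == "__main__":
    ekf = WeightedEKF(x0=[0, 0, 0], P0=np.eye(3) * 0.1,
                      Q=np.diag([0.01, 0.01, 0.005]),
                      R_base=np.diag([0.05, 0.05]))
    ekf.predict(v=0.5, omega=0.1, dt=0.1)               # wheel-odometry step
    ekf.update_uwb(z_xy=[0.06, 0.01], nlos_weight=1.0)  # LOS UWB fix
    print(ekf.pose_factor())
```

In the second stage described by the abstract, each (mean, covariance) pair produced this way would be attached to the nearest LiDAR keyframe as an additional constraint alongside the IMU pre-integration and scan-matching factors; how those factors are weighted and selected is detailed in the paper itself, not in this sketch.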

Published in International Journal of Sensors and Sensor Networks (Volume 12, Issue 1)
DOI 10.11648/j.ijssn.20241201.12
Page(s) 13-22
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2024. Published by Science Publishing Group

Keywords

AGV, Indoor Positioning, W-EKF, Data Fusion

Cite This Article
  • APA Style

    Zhu, W., Guo, S. (2024). Indoor Positioning of AGVs Based on Multi-Sensor Data Fusion Such as LiDAR. International Journal of Sensors and Sensor Networks, 12(1), 13-22. https://doi.org/10.11648/j.ijssn.20241201.12


  • ACS Style

    Zhu, W.; Guo, S. Indoor Positioning of AGVs Based on Multi-Sensor Data Fusion Such as LiDAR. Int. J. Sens. Sens. Netw. 2024, 12(1), 13-22. doi: 10.11648/j.ijssn.20241201.12


  • AMA Style

    Zhu W, Guo S. Indoor Positioning of AGVs Based on Multi-Sensor Data Fusion Such as LiDAR. Int J Sens Sens Netw. 2024;12(1):13-22. doi: 10.11648/j.ijssn.20241201.12


  • @article{10.11648/j.ijssn.20241201.12,
      author = {Wen-liang Zhu and Shu-kai Guo},
      title = {Indoor Positioning of AGVs Based on Multi-Sensor Data Fusion Such as LiDAR},
      journal = {International Journal of Sensors and Sensor Networks},
      volume = {12},
      number = {1},
      pages = {13-22},
      doi = {10.11648/j.ijssn.20241201.12},
      url = {https://doi.org/10.11648/j.ijssn.20241201.12},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ijssn.20241201.12},
      abstract = {In recent years, with the rapid growth in technology and demand for industrial robots, Automated Guided Vehicles (AGVs) have found widespread application in industrial workshops and smart logistics, emerging as a global hot research topic. Due to the volatile and complex working environments, the positioning technology of AGV robots is of paramount importance. To address the challenges associated with AGV robot positioning, such as significant accumulated errors in wheel odometer and Inertial Measurement Unit (IMU), susceptibility of Ultra Wide Band (UWB) positioning accuracy to Non Line of Sight (NLOS) errors, as well as the distortion points and drift in point clouds collected by LiDAR during robot motion, a novel positioning method is proposed. Initially, Weighted Extended Kalman Filter (W-EKF) is employed for the loosely coupled integration of wheel odometer and Ultra Wide Band (UWB) data, transformed into W-EKF pose factors. Subsequently, appropriate addition of W-EKF factors is made during the tight coupling of pre-integrated Inertial Measurement Unit (IMU) with 3D-LiDAR to counteract the distortion points, drift, and accumulated errors generated by LiDAR, thereby enhancing positioning accuracy. After experimentation, the algorithm achieved a final positioning error of only 6.9cm, representing an approximately 80% improvement in positioning accuracy compared to the loosely coupled integration of the two sensors.
    },
     year = {2024}
    }
    


  • TY  - JOUR
    T1  - Indoor Positioning of AGVs Based on Multi-Sensor Data Fusion Such as LiDAR
    AU  - Wen-liang Zhu
    AU  - Shu-kai Guo
    Y1  - 2024/03/20
    PY  - 2024
    N1  - https://doi.org/10.11648/j.ijssn.20241201.12
    DO  - 10.11648/j.ijssn.20241201.12
    T2  - International Journal of Sensors and Sensor Networks
    JF  - International Journal of Sensors and Sensor Networks
    JO  - International Journal of Sensors and Sensor Networks
    SP  - 13
    EP  - 22
    PB  - Science Publishing Group
    SN  - 2329-1788
    UR  - https://doi.org/10.11648/j.ijssn.20241201.12
    AB  - In recent years, with the rapid growth in technology and demand for industrial robots, Automated Guided Vehicles (AGVs) have found widespread application in industrial workshops and smart logistics, emerging as a global hot research topic. Due to the volatile and complex working environments, the positioning technology of AGV robots is of paramount importance. To address the challenges associated with AGV robot positioning, such as significant accumulated errors in wheel odometer and Inertial Measurement Unit (IMU), susceptibility of Ultra Wide Band (UWB) positioning accuracy to Non Line of Sight (NLOS) errors, as well as the distortion points and drift in point clouds collected by LiDAR during robot motion, a novel positioning method is proposed. Initially, Weighted Extended Kalman Filter (W-EKF) is employed for the loosely coupled integration of wheel odometer and Ultra Wide Band (UWB) data, transformed into W-EKF pose factors. Subsequently, appropriate addition of W-EKF factors is made during the tight coupling of pre-integrated Inertial Measurement Unit (IMU) with 3D-LiDAR to counteract the distortion points, drift, and accumulated errors generated by LiDAR, thereby enhancing positioning accuracy. After experimentation, the algorithm achieved a final positioning error of only 6.9cm, representing an approximately 80% improvement in positioning accuracy compared to the loosely coupled integration of the two sensors.
    
    VL  - 12
    IS  - 1
    ER  - 


Author Information
  • Wen-liang Zhu, School of Mechanical Engineering, Jiangsu Ocean University, Lianyungang, China

  • Shu-kai Guo, School of Mechanical Engineering, Jiangsu Ocean University, Lianyungang, China
