Peer-Reviewed

Helipad Pose Estimation Using Intelligent Helipad and LiDAR Camera Fusion (IHLCF)

Received: 28 May 2022     Accepted: 15 June 2022     Published: 19 September 2022
Abstract

Pose estimation has become a valuable capability in autonomous systems. It refers to the techniques computers use to detect and quantify certain features in an image. The present work proposes intelligent detection and pose estimation of a modified helipad using a fusion of camera and LiDAR data. Images are first captured by the drone’s downward-facing camera and converted to binary form using Otsu thresholding. Next, a Boundary Parametric Ellipse Fitting (BPEF) algorithm is employed to detect circles, which appear as ellipses when the image is tangentially distorted. The Ellipses Region of Interest (EROI) is then extracted from the image via the candidate circles. The algorithm uses a modified helipad with an arrow sign located outside the helipad’s circle. The arrow’s centroid lies on the axial line that horizontally bisects the letter “H” and passes through its centroid. Hence, candidate arrows are extracted using the proposed over-the-line-and-between-ellipses check. A Support Vector Machine (SVM) is then trained on 400 images of “H” and arrow patterns to detect the helipad. In the following phase, the corners of the “H” and the arrow are detected and localized. Projected LiDAR data are subsequently used to obtain depth information at the corners. Finally, the corners’ coordinates are combined with the rigid-body transformation to recover the translational and rotational pose components. Software-in-the-Loop (SIL) simulation is used to assess the method. The experimental setup keeps the drone motionless over the landing platform while it performs pose estimation. The method was compared with the AprilTag Detection Algorithm (ATDA), and the Root Mean Square Error (RMSE) statistic was used to gauge its accuracy. The results confirmed a notable improvement in both rotational and translational estimation.
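
As a concrete illustration of the front end described above, the following Python sketch shows how Otsu thresholding and contour-based ellipse fitting could produce the candidate ellipses. It is a minimal sketch using OpenCV, not the authors' implementation; the size gates and fit tolerance are assumptions.

    # Minimal sketch (assumed OpenCV pipeline, not the paper's code):
    # binarize with Otsu's method, then fit ellipses to boundary contours.
    import cv2
    import numpy as np

    def find_candidate_ellipses(gray):
        """Return fitted ellipses whose shape closely matches their contour."""
        # Otsu picks the global threshold minimizing intra-class variance.
        _, binary = cv2.threshold(gray, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_LIST,
                                       cv2.CHAIN_APPROX_NONE)
        candidates = []
        for c in contours:
            if len(c) < 20 or cv2.contourArea(c) < 500:  # assumed size gates
                continue
            (cx, cy), (MA, ma), angle = cv2.fitEllipse(c)
            ellipse_area = np.pi * MA * ma / 4.0
            # Keep contours whose area agrees with the fitted ellipse's area,
            # a cheap test that the boundary really is elliptical.
            if abs(cv2.contourArea(c) - ellipse_area) < 0.1 * ellipse_area:
                candidates.append(((cx, cy), (MA, ma), angle))
        return candidates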
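The SVM stage could likewise be prototyped as below. The HOG descriptor and scikit-learn classifier are assumptions for illustration, since the abstract states only that an SVM was trained on 400 images of “H” and arrow patterns.

    # Minimal sketch (assumed features and API, not the paper's code):
    # classify candidate ROIs as "H", arrow, or background with an SVM.
    import cv2
    import numpy as np
    from sklearn.svm import SVC

    # HOG over a fixed 64x64 window: 16x16 blocks, 8x8 stride and cells, 9 bins.
    hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)

    def describe(patch):
        """Resize a grayscale ROI to 64x64 and compute its HOG features."""
        return hog.compute(cv2.resize(patch, (64, 64))).ravel()

    def train_classifier(patches, labels):
        """Fit an SVM on HOG features of labelled training patches."""
        X = np.array([describe(p) for p in patches])
        clf = SVC(kernel="rbf", C=10.0)  # assumed kernel and regularization
        clf.fit(X, labels)
        return clf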
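Finally, once each corner has image coordinates and LiDAR depth, the rigid-body transform can be recovered from 3D point correspondences. The SVD-based Kabsch solution below is a standard method offered as a sketch, not necessarily the projection used in the paper, together with the RMSE metric used for evaluation.

    # Minimal sketch: least-squares rigid transform (Kabsch) between the
    # known helipad corner layout and the measured 3D corners, plus RMSE.
    import numpy as np

    def rigid_transform(model_pts, measured_pts):
        """Solve measured = R @ model + t for rotation R and translation t."""
        mu_m = model_pts.mean(axis=0)
        mu_s = measured_pts.mean(axis=0)
        H = (model_pts - mu_m).T @ (measured_pts - mu_s)  # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = mu_s - R @ mu_m
        return R, t

    def rmse(estimate, truth):
        """Root mean square error between estimated and reference values."""
        diff = np.asarray(estimate) - np.asarray(truth)
        return float(np.sqrt(np.mean(diff ** 2)))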

Published in International Journal of Sensors and Sensor Networks (Volume 10, Issue 2)
DOI 10.11648/j.ijssn.20221002.11
Page(s) 16-24
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2022. Published by Science Publishing Group

Keywords

Pose Estimation, SVM, LiDAR Camera Calibration, UAV, Landing System, Drone

Cite This Article
  • APA Style

    Sefidgar, M., & Landry, R. J. (2022). Helipad Pose Estimation Using Intelligent Helipad and LiDAR Camera Fusion (IHLCF). International Journal of Sensors and Sensor Networks, 10(2), 16-24. https://doi.org/10.11648/j.ijssn.20221002.11

  • ACS Style

    Sefidgar, M.; Landry, R. J. Helipad Pose Estimation Using Intelligent Helipad and LiDAR Camera Fusion (IHLCF). Int. J. Sens. Sens. Netw. 2022, 10(2), 16-24. doi: 10.11648/j.ijssn.20221002.11

  • AMA Style

    Sefidgar M, Landry RJ. Helipad Pose Estimation Using Intelligent Helipad and LiDAR Camera Fusion (IHLCF). Int J Sens Sens Netw. 2022;10(2):16-24. doi: 10.11648/j.ijssn.20221002.11

  • BibTeX

    @article{10.11648/j.ijssn.20221002.11,
      author = {Mohammad Sefidgar and Rene Jr. Landry},
      title = {Helipad Pose Estimation Using Intelligent Helipad and LiDAR Camera Fusion (IHLCF)},
      journal = {International Journal of Sensors and Sensor Networks},
      volume = {10},
      number = {2},
      pages = {16-24},
      doi = {10.11648/j.ijssn.20221002.11},
      url = {https://doi.org/10.11648/j.ijssn.20221002.11},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ijssn.20221002.11},
      year = {2022}
    }
    

  • RIS

    TY  - JOUR
    T1  - Helipad Pose Estimation Using Intelligent Helipad and LiDAR Camera Fusion (IHLCF)
    AU  - Mohammad Sefidgar
    AU  - Rene Jr. Landry
    Y1  - 2022/09/19
    PY  - 2022
    N1  - https://doi.org/10.11648/j.ijssn.20221002.11
    DO  - 10.11648/j.ijssn.20221002.11
    T2  - International Journal of Sensors and Sensor Networks
    JF  - International Journal of Sensors and Sensor Networks
    JO  - International Journal of Sensors and Sensor Networks
    SP  - 16
    EP  - 24
    PB  - Science Publishing Group
    SN  - 2329-1788
    UR  - https://doi.org/10.11648/j.ijssn.20221002.11
    VL  - 10
    IS  - 2
    ER  - 

Author Information
  • Mohammad Sefidgar, Department of Electrical Engineering, ETS University of Quebec, Montreal, Canada

  • Rene Jr. Landry, Department of Electrical Engineering, ETS University of Quebec, Montreal, Canada
