International Journal of Sensors and Sensor Networks


Helipad Pose Estimation Using Intelligent Helipad and LiDAR Camera Fusion (IHLCF)

Pose estimation has become a valuable capability in autonomous systems: it refers to the techniques a computer uses to detect features in an image and quantify their position and orientation. The present work proposes intelligent detection and pose estimation of a modified helipad using a fusion of camera and LiDAR data. Image data are first collected through the drone's downward-facing camera and converted to a binary image using Otsu thresholding. Next, a Boundary Parametric Ellipse Fitting (BPEF) algorithm is employed to detect circles, which appear as ellipses when the image contains tangential distortion. Ellipse Regions of Interest (EROI) are then extracted from the images around the candidate circles. The algorithm uses a modified helipad with an arrow sign placed outside the helipad's circle. The arrow's centroid lies on the axial line that horizontally bisects the letter "H" and passes through the letter's centroid; hence, candidate arrows are extracted using the proposed over-the-line-and-between-ellipses check. A Support Vector Machine (SVM) is then trained on 400 images of the letter "H" and arrow patterns to detect the helipad. In the following phase, the corners of the "H" and the arrow are detected and localized, and the projected LiDAR data are used to obtain depth information at those corners. Finally, the corners' coordinates are used to compute the rigid body transformation, yielding the translational and rotational pose components. Software-in-the-Loop (SIL) simulation is used to assess the method's accuracy: the experimental setup keeps the drone motionless over the landing platform while it performs pose estimation. The method is compared with the AprilTag Detection Algorithm (ATDA), and the statistical Root Mean Square Error (RMSE) is used to gauge the accuracy of the proposed method. The analysis results confirm a notable improvement in both rotational and translational estimates.
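The binarization step described above relies on Otsu's method, which picks the gray level that maximizes between-class variance. The sketch below is a minimal NumPy illustration of that criterion, not the authors' implementation; the synthetic half-dark, half-bright image is purely illustrative.

```python
import numpy as np

def otsu_threshold(gray):
    """Return the Otsu threshold for an 8-bit grayscale image:
    the level that maximizes the between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                      # gray-level probabilities
    omega = np.cumsum(p)                       # class-0 probability mass
    mu = np.cumsum(p * np.arange(256))         # class-0 cumulative mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0.0       # ignore empty classes
    return int(np.argmax(sigma_b))

# Illustrative image: dark left half, bright right half.
gray = np.zeros((8, 8), dtype=np.uint8)
gray[:, 4:] = 180
t = otsu_threshold(gray)
binary = (gray > t).astype(np.uint8)           # binarized image
```

In the paper's pipeline this binary image then feeds the BPEF circle/ellipse detection stage.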
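The final step, recovering the rotational and translational pose components from corresponding corner coordinates, amounts to estimating a rigid body transformation between two 3-D point sets. A standard way to do this is the SVD-based Kabsch method, sketched below under the assumption of exact, noise-free correspondences; the function name and synthetic data are illustrative, not taken from the paper.

```python
import numpy as np

def rigid_transform(A, B):
    """Estimate rotation R and translation t such that B_i ≈ R @ A_i + t.
    A, B: (N, 3) arrays of corresponding 3-D points, e.g. helipad corners
    in the pad frame and in the camera/LiDAR frame."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cB - R @ cA
    return R, t

# Synthetic check: rotate six random "corners" by 30 deg about z and shift.
rng = np.random.default_rng(0)
A = rng.random((6, 3))
th = np.deg2rad(30.0)
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0,         0.0,        1.0]])
t_true = np.array([0.5, -1.0, 2.0])
B = A @ R_true.T + t_true
R, t = rigid_transform(A, B)
# RMSE of the re-projected corners, as used to gauge accuracy in the paper.
rmse = np.sqrt(np.mean(np.sum((A @ R.T + t - B) ** 2, axis=1)))
```

With noisy LiDAR depths the same least-squares solution applies; the residual RMSE then becomes a meaningful accuracy measure rather than numerically zero.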

Pose Estimation, SVM, LiDAR Camera Calibration, UAV, Landing System, Drone

Mohammad Sefidgar, Rene Jr. Landry. (2022). Helipad Pose Estimation Using Intelligent Helipad and LiDAR Camera Fusion (IHLCF). International Journal of Sensors and Sensor Networks, 10(2), 16-24.

Copyright © 2022. The authors retain the copyright of this article.
This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
