
A survey on stereo vision-based autonomous navigation for multi-rotor MUAVs

Published online by Cambridge University Press: 06 May 2018

Jose-Pablo Sanchez-Rodriguez*
Affiliation: Tecnologico de Monterrey, Atizapán de Zaragoza, Estado de México, Z.C. 52926, México. E-mail: aaceves@itesm.mx
Alejandro Aceves-Lopez
Affiliation: Tecnologico de Monterrey, Atizapán de Zaragoza, Estado de México, Z.C. 52926, México. E-mail: aaceves@itesm.mx
*Corresponding author. E-mail: pablo270991@gmail.com

Summary

This paper presents an overview of recent vision-based multi-rotor micro unmanned aerial vehicles (MUAVs) intended for autonomous navigation using a stereoscopic camera. Remotely piloting a drone is difficult: the pilot needs flying expertise, has a limited field of view, and may face unfortunate situations such as loss of line of sight or collisions with objects such as wires and branches. Autonomous navigation is an even harder challenge than remote-controlled flight because the drone must make decisions on its own in real time and, if no map of its surroundings is available, build one simultaneously. Moreover, MUAVs are constrained in useful payload and energy consumption, so a drone must carry small, lightweight sensors together with an onboard computer powerful enough to interpret its surroundings and navigate safely towards its goal. A stereoscopic camera is considered a suitable sensor because of its three-dimensional (3D) sensing capabilities: the drone can perform vision-based navigation through object recognition and self-localise within a map if one is available; otherwise, autonomous navigation becomes a simultaneous localisation and mapping (SLAM) problem.
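
The 3D capability that makes a stereoscopic camera attractive comes from triangulation: a scene point seen with horizontal disparity d by two rectified cameras with focal length f (in pixels) and baseline B lies at depth Z = fB/d. As a minimal sketch of this idea, assuming OpenCV is available and using illustrative values (the calibration numbers and the filenames left.png and right.png are placeholders, not taken from the paper), a dense depth map can be computed as follows:

import cv2
import numpy as np

fx = 700.0        # focal length in pixels (assumed calibration value)
baseline = 0.12   # stereo baseline in metres (assumed)

# Rectified grayscale stereo pair; filenames are placeholders.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching; OpenCV returns disparity in 16.4 fixed point,
# so the raw result is divided by 16 to obtain disparity in pixels.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Triangulation: Z = fx * B / d, valid only where disparity is positive.
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = fx * baseline / disparity[valid]

A depth map of this kind is the raw input that the surveyed MUAV systems feed into obstacle avoidance, self-localisation and SLAM; the matcher parameters above are illustrative, not tuned values from any cited system.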

Type: Articles
Copyright © Cambridge University Press 2018
