Technical Papers
Oct 23, 2015

An Algorithm for Autonomous Aerial Navigation Using Landmarks

Publication: Journal of Aerospace Engineering
Volume 29, Issue 3

Abstract

This paper describes a novel approach to vision-based passive navigation of unmanned aerial vehicles (UAVs) suitable for outdoor environments. A set of coordinate points on a flat-earth model was chosen as waypoints. At each waypoint, several objects were selected as landmarks, forming a unique polygonal constellation. Features of these landmarks and waypoints were computed in advance and stored in a database. A six-degree-of-freedom kinematic model of a UAV flew from one waypoint to the next in a detailed simulation that included real aerial imagery. An image of the terrain was captured on approach to each waypoint, and an illumination-, scale-, and rotation-invariant algorithm extracted landmark and waypoint features, which were then matched against the database. Position drift was computed at each waypoint and used to update the UAV's current position before it headed to the next waypoint. Because the drift measured by the vision-based algorithm reflects error caused primarily by wind, it was also used to estimate wind speed and direction. Experiments with both computer-generated images and real images from a UAV flight trial demonstrated the technique in the presence of wind and Gaussian noise. The results show the accuracy of the drift-computation algorithm and the reliability of the feature-matching algorithm under various environmental conditions, and both algorithms outperformed other popular algorithms in the field.
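
To make the pipeline concrete, the sketch below shows one way the waypoint-by-waypoint drift-correction and wind-estimation steps could fit together. It is a minimal illustration under stated assumptions, not the authors' implementation: a flat-earth frame, a constant airspeed over each leg, and a hypothetical vision_fix() function that collapses the entire landmark-constellation matching stage into a single noisy position measurement. All waypoints, wind values, and noise levels are made up for illustration.

    # Sketch only: the paper's landmark-constellation matching is replaced
    # by a noisy "vision_fix"; wind, waypoints, and noise are made-up values.
    import numpy as np

    rng = np.random.default_rng(0)

    WAYPOINTS = np.array([[0.0, 0.0], [1000.0, 0.0], [1000.0, 800.0]])  # m
    AIRSPEED = 20.0                       # m/s, assumed constant over a leg
    WIND = np.array([2.0, -1.5])          # m/s, unknown to the navigator

    def fly_leg(start, target):
        """Dead-reckon toward the target; wind displaces the true track."""
        duration = np.linalg.norm(target - start) / AIRSPEED  # leg time (s)
        believed = target.copy()          # navigator believes it has arrived
        true_pos = target + WIND * duration  # wind offset accumulated en route
        return believed, true_pos, duration

    def vision_fix(true_pos, sigma=1.0):
        """Stand-in for matching the waypoint's landmark constellation:
        returns the position implied by the matched features, perturbed by
        Gaussian noise as in the paper's experiments."""
        return true_pos + rng.normal(0.0, sigma, size=2)

    pos = WAYPOINTS[0].copy()
    for wp in WAYPOINTS[1:]:
        believed, true_pos, t = fly_leg(pos, wp)
        fix = vision_fix(true_pos)
        drift = fix - believed            # position drift at the waypoint
        wind_est = drift / t              # drift over leg time ~ wind vector
        pos = fix                         # correct position before next leg
        speed = np.linalg.norm(wind_est)
        heading = np.degrees(np.arctan2(wind_est[1], wind_est[0]))
        print(f"drift {np.round(drift, 2)} m, "
              f"wind ~{speed:.2f} m/s at {heading:.0f} deg")

In the paper itself, the position fix comes from matching illumination-, scale-, and rotation-invariant features of the precomputed landmark constellations against the onboard database; abstracting that stage into one function keeps the drift computation, position correction, and wind estimate visible at a glance.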

History

Received: Apr 16, 2014
Accepted: Jul 23, 2015
Published online: Oct 23, 2015
Discussion open until: Mar 23, 2016
Published in print: May 1, 2016

Authors

Affiliations

Aakash Dawadee
Ph.D. Student, School of Engineering, Univ. of South Australia, University Blvd., Mawson Lakes, SA 5095, Australia (corresponding author). E-mail: [email protected]
Javaan Chahl
Professor of Sensor Systems, School of Engineering, Univ. of South Australia, University Blvd., Mawson Lakes, SA 5095, Australia.
D. (Nanda) Nandagopal
Professor of Defence Systems, Division of IT, Engineering and the Environment, Univ. of South Australia, University Blvd., Mawson Lakes, SA 5095, Australia.
