Case Studies
Mar 6, 2013

Feature Conjugation for Intensity-Coded LIDAR Point Clouds

Publication: Journal of Surveying Engineering
Volume 139, Issue 3

Abstract

Feature conjugation is a major task in modern spatial analysis and contributes to efficient integration across multiple data sets. In this study, an efficient approach that utilizes the intensity information provided in most light detection and ranging (LIDAR) data sets for feature conjugation is proposed. First, a two-dimensional (2D) intensity map is generated from the original intensity-coded LIDAR observables in three-dimensional (3D) space. The 2D map is then resampled into a regularly gridded image, and an image feature detection technique is applied to identify point conjugations between a pair of intensity maps. Finally, the paired conjugations in image space are mapped back into the LIDAR space, where the object coordinates of the conjugate points can be verified and obtained. Numerical results from a real-world case study illustrate that, by fully exploiting the available spectral information, reliable feature conjugation across multiple LIDAR data sets can be achieved in an efficient and automatic manner.
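The pipeline in the abstract can be illustrated with a minimal numpy sketch. This is not the authors' implementation: it assumes a nadir x-y projection onto a regular grid, and the helper name `rasterize_intensity` is hypothetical. It shows the two mapping steps that bracket the feature-detection stage: projecting 3D intensity-coded points into a 2D image, and mapping a matched pixel back to its 3D object coordinates.

```python
import numpy as np

def rasterize_intensity(points, intensity, res):
    """Project 3D intensity-coded points onto a regular 2D grid (x-y plane).

    Returns the intensity image plus a per-pixel index of the contributing
    point, so that matches found in image space can be mapped back to 3D.
    """
    xy = points[:, :2]
    origin = xy.min(axis=0)
    cols = np.floor((xy - origin) / res).astype(int)
    shape = tuple(cols.max(axis=0) + 1)[::-1]  # rows = y, cols = x
    img = np.zeros(shape)
    back = -np.ones(shape, dtype=int)          # -1 marks empty cells
    for i, (cx, cy) in enumerate(cols):
        img[cy, cx] = intensity[i]             # last point wins; a mean would also work
        back[cy, cx] = i
    return img, back, origin

# Toy cloud: four points on a 1 m grid with distinct intensities.
pts = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 1.2],
                [0.0, 1.0, 0.9], [1.0, 1.0, 1.1]])
inten = np.array([10.0, 20.0, 30.0, 40.0])
img, back, origin = rasterize_intensity(pts, inten, res=1.0)

# A "matched" pixel in image space maps back to its 3D object coordinates.
row, col = 1, 1
assert np.allclose(pts[back[row, col]], [1.0, 1.0, 1.1])
```

In the paper's workflow, a feature detector (e.g., SIFT- or SURF-style keypoints) would be run on a pair of such intensity images, and only the back-mapping step above would then recover the conjugate points' object coordinates for verification.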


Acknowledgments

The authors thank the editor and three anonymous reviewers for their constructive comments, which significantly improved the quality of the original manuscript. Funding support from the National Science Council of Taiwan (Contract No. NSC101-2221-E-002-123-MY2) is also gratefully acknowledged.


Information & Authors

Published In

Journal of Surveying Engineering
Volume 139, Issue 3, August 2013
Pages: 135–142

History

Received: Nov 9, 2012
Accepted: Mar 4, 2013
Published online: Mar 6, 2013
Published in print: Aug 1, 2013


Authors

Affiliations

Jen-Yu Han, M.ASCE
Associate Professor, Dept. of Civil Engineering, National Taiwan Univ., Taipei 106, Taiwan (corresponding author). E-mail: [email protected]
Nei-Hao Perng
Graduate Student, Dept. of Civil Engineering, National Taiwan Univ., Taipei 106, Taiwan. E-mail: [email protected]
Yan-Ting Lin
Graduate Student, Dept. of Civil Engineering, National Taiwan Univ., Taipei 106, Taiwan. E-mail: [email protected]
