ABSTRACT

The growing emphasis on smart cities and digital twins has heightened the significance of 3D geospatial information. However, 3D modeling of urban buildings, a major object of interest, typically involves time-consuming manual labor. Several studies have reconstructed 3D urban buildings from multidimensional and multitemporal data. However, constraints such as data availability, view angles, occlusions, and texture detail limit the accuracy and completeness of the resulting 3D models, and inconsistencies across datasets make it difficult to compare results across studies and to develop generalized methods. In this study, we apply neural rendering techniques to reconstruct 3D building models in urban areas from imagery captured by an unmanned aerial vehicle (UAV). A neural radiance field (NeRF) synthesizes a continuous scene representation, rendering novel views in directions not covered by the UAV images. The proposed method thus provides a more comprehensive view that supplements occlusions in the photogrammetry-based point cloud. The results were evaluated quantitatively and indicate that the proposed method is a feasible complementary solution for photogrammetry-based 3D building modeling.
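At the core of NeRF-based view synthesis is the volume-rendering quadrature that composites predicted densities and colors along each camera ray into a single pixel value. The sketch below illustrates only that compositing step, assuming per-sample densities and colors are already available; the function name and NumPy setup are illustrative, not the authors' implementation:

```python
import numpy as np

def volume_render(sigmas, colors, deltas):
    """Composite per-sample densities and colors along one ray into a pixel color.

    sigmas: (N,) volume densities at the ray samples
    colors: (N, 3) RGB predicted at each sample
    deltas: (N,) distances between adjacent samples
    """
    # Segment opacity: alpha_i = 1 - exp(-sigma_i * delta_i)
    alpha = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance T_i: probability the ray reaches sample i unoccluded
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    # Per-sample contribution weights, then the composited pixel color
    weights = trans * alpha
    rgb = (weights[:, None] * colors).sum(axis=0)
    return rgb, weights
```

An opaque sample dominates the weights, so the rendered color converges to that sample's color; in a full NeRF the densities and colors come from an MLP queried at positionally encoded sample locations along rays cast from the UAV camera poses.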


Published In

Computing in Civil Engineering 2023
Pages: 34 - 41
Published online: Jan 25, 2024

Authors

Affiliations

Cheolhwan Kim [email protected]
1Ph.D. Student, GRSLAB, Dept. of Civil and Environmental Engineering, Yonsei Univ., Seoul, Korea. Email: [email protected]
2Ph.D. Student, GRSLAB, Dept. of Civil and Environmental Engineering, Yonsei Univ., Seoul, Korea. Email: [email protected]
Wonjun Choi [email protected]
3Ph.D. Student, GRSLAB, Dept. of Civil and Environmental Engineering, Yonsei Univ., Seoul, Korea. Email: [email protected]
Youngmok Kwon [email protected]
4Ph.D. Student, GRSLAB, Dept. of Civil and Environmental Engineering, Yonsei Univ., Seoul, Korea. Email: [email protected]
Hong-Gyoo Sohn [email protected]
5Professor, GRSLAB, Dept. of Civil and Environmental Engineering, Yonsei Univ., Seoul, Korea. ORCID: https://orcid.org/0000-0003-1839-3431. Email: [email protected]
