Technical Papers
May 15, 2018

Matching Construction Workers across Views for Automated 3D Vision Tracking On-Site

Publication: Journal of Construction Engineering and Management
Volume 144, Issue 7

Abstract

Computer vision–based tracking methods are used to track construction resources for productivity and safety purposes. This type of tracking requires that targets be accurately matched across multiple camera views so that a three-dimensional (3D) trajectory can be obtained from two or more two-dimensional (2D) trajectories. Such matching is straightforward for easily distinguishable targets in uncluttered scenes, but it becomes challenging in industrial settings such as construction sites because of congestion, occlusions, and workers wearing nearly identical high-visibility apparel. This paper proposes a novel vision-based method that addresses these issues. The method takes the output of a 2D vision-based tracking method as input and searches for potential matches in three sequential steps, terminating only when a positive match is found. The first step returns the strongest candidate by correlating segments of the workers’ past 2D trajectories; the second applies geometric restrictions; and the third correlates color intensity values. The proposed method achieves a promising performance of 97% precision, 98% recall, and 95% accuracy.
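The first step described in the abstract, ranking candidate matches by correlating segments of workers’ past 2D trajectories, can be sketched as follows. This is an illustrative approximation only, not the authors’ implementation: the function names, the use of Pearson correlation over the x and y coordinate series, and the acceptance threshold are all assumptions for the sketch.

```python
import numpy as np

def trajectory_similarity(traj_a, traj_b):
    """Pearson correlation between two equal-length 2D trajectory
    segments, averaged over the x and y coordinate series.
    (Illustrative choice of similarity measure, not the paper's.)"""
    a = np.asarray(traj_a, dtype=float)  # shape (T, 2): T frames of (x, y)
    b = np.asarray(traj_b, dtype=float)
    scores = []
    for dim in range(2):
        x, y = a[:, dim], b[:, dim]
        if x.std() == 0 or y.std() == 0:
            continue  # a constant coordinate series carries no motion signal
        scores.append(np.corrcoef(x, y)[0, 1])
    return float(np.mean(scores)) if scores else 0.0

def best_match(query_traj, candidate_trajs, threshold=0.9):
    """Return the index of the candidate in the second view whose recent
    motion best correlates with the query worker's trajectory, or None if
    no candidate clears the threshold (in the paper's pipeline, the later
    geometric and color-correlation steps would then decide)."""
    scores = [trajectory_similarity(query_traj, c) for c in candidate_trajs]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None
```

Because the two cameras view the same 3D motion, a worker’s trajectory in one view tends to correlate strongly with the same worker’s trajectory in the other view even when the image coordinates differ by an offset and scale, which is why a correlation-based first pass can prune most false candidates cheaply.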


Data Availability Statement

Data generated by the authors or analyzed during the study are available at: https://doi.org/10.5281/zenodo.839674. Information about the Journal’s data sharing policy can be found here: http://ascelibrary.org/doi/10.1061/%28ASCE%29CO.1943-7862.0001263.

Acknowledgments

This research was funded by an ICASE studentship award, supported by EPSRC and Laing O’Rourke PLC under Grant No. 13440016. Any opinions, findings, and conclusions or recommendations included in this paper are those of the authors and do not necessarily reflect the views of the organizations and people mentioned previously.


Information & Authors

Information

Published In

Journal of Construction Engineering and Management
Volume 144, Issue 7, July 2018

History

Received: Aug 15, 2017
Accepted: Jan 3, 2018
Published online: May 15, 2018
Published in print: Jul 1, 2018
Discussion open until: Oct 15, 2018


Authors

Affiliations

Eirini Konstantinou [email protected]
Ph.D. Candidate, Dept. of Engineering, Univ. of Cambridge, Trumpington St., Cambridge CB2 1PZ, UK (corresponding author). Email: [email protected]
Ioannis Brilakis, M.ASCE [email protected]
Laing O’Rourke Reader, Dept. of Engineering, Univ. of Cambridge, Trumpington St., Cambridge CB2 1PZ, UK. Email: [email protected]


