
Traffic Sign Detection and Recognition for Autonomous Driving in Virtual Simulation Environment

Publication: International Conference on Transportation and Development 2022

ABSTRACT

This study developed a traffic sign detection and recognition algorithm based on RetinaNet. Two main aspects were revised to improve the detection of traffic signs: image cropping, to address the problem of large images containing small traffic signs, and additional anchors at more scales, to detect traffic signs of different sizes and shapes. The proposed algorithm was trained and tested on a series of front-view autonomous driving images from a virtual simulation environment. Results show that the algorithm performed well under good illumination and weather conditions. Its drawbacks are that it sometimes failed to detect signs under adverse weather conditions such as snow and failed to distinguish speed limit signs with different limit values.
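As a purely illustrative sketch of the two modifications described above, and not the authors' implementation, the Python snippet below shows one plausible way to tile a large front-view frame into overlapping crops and to enumerate a denser per-level anchor set in the spirit of RetinaNet. All names, the tile size and overlap, and the anchor scales and aspect ratios are assumptions.

import numpy as np


def crop_into_tiles(image, tile=640, overlap=128):
    """Split an H x W x C frame into overlapping square tiles.

    Returns (tile_array, (y_offset, x_offset)) pairs; the offsets map
    detections made in tile coordinates back to the full frame.
    (Illustrative values; the paper does not specify tile sizes.)
    """
    h, w = image.shape[:2]
    stride = tile - overlap
    tiles = []
    for y in range(0, max(h - overlap, 1), stride):
        for x in range(0, max(w - overlap, 1), stride):
            y0 = min(y, max(h - tile, 0))
            x0 = min(x, max(w - tile, 0))
            tiles.append((image[y0:y0 + tile, x0:x0 + tile], (y0, x0)))
    return tiles


def make_anchors(base_size, scales=(2 ** 0, 2 ** (1 / 3), 2 ** (2 / 3)),
                 ratios=(0.5, 1.0, 2.0)):
    """Return len(scales) * len(ratios) anchors as (width, height) pairs for
    one feature-map level; adding more scales/ratios here is the kind of
    change the abstract refers to (exact values are assumptions)."""
    anchors = []
    for s in scales:
        for r in ratios:
            area = (base_size * s) ** 2       # anchor area at this scale
            width = np.sqrt(area / r)         # chosen so height / width == r
            anchors.append((width, width * r))
    return np.array(anchors)


if __name__ == "__main__":
    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in front-view frame
    print(len(crop_into_tiles(frame)), "overlapping tiles")
    print(make_anchors(base_size=32).round(1))

With these example values, a 1080 x 1920 frame yields eight overlapping 640 x 640 crops and each feature-map level gets nine anchor shapes; detections found in a crop are shifted back to full-frame coordinates using the stored offsets.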



Information & Authors

Information

Published In

International Conference on Transportation and Development 2022
Pages: 12 - 18

History

Published online: Aug 31, 2022


Authors

Affiliations

Meixin Zhu
Dept. of Civil and Environmental Engineering, Univ. of Washington, Seattle, WA
Hao (Frank) Yang
Dept. of Civil and Environmental Engineering, Univ. of Washington, Seattle, WA
Zhiyong Cui
Dept. of Civil and Environmental Engineering, Univ. of Washington, Seattle, WA
Yinhai Wang
Dept. of Civil and Environmental Engineering, Univ. of Washington, Seattle, WA. Email: [email protected]
