Technical Papers
Oct 27, 2023

Driver Emotion Recognition Involving Multimodal Signals: Electrophysiological Response, Nasal-Tip Temperature, and Vehicle Behavior

Publication: Journal of Transportation Engineering, Part A: Systems
Volume 150, Issue 1

Abstract

Accurate driver emotion recognition is a key challenge in building intelligent vehicle safety-assistance systems. In this paper, we conduct a driving simulator study on driver emotion recognition. Taking the car-following scene as an example, multimodal parameters of drivers in five emotional states (neutral, joy, fear, sadness, and anger) are obtained from an emotion-induction experiment and a simulated driving experiment. Wavelet denoising and baseline-removal (debase) processing are used to reduce the influence of signal noise and of individual differences between drivers. Statistical-domain and time-frequency-domain features of the electrophysiological response signals, nasal-tip temperature signals, and vehicle behavior signals are analyzed. Factor analysis is used to extract and reduce the feature parameters, and driver emotion recognition models are established with machine learning methods including random forest (RF), K-nearest-neighbor (KNN), and extreme gradient boosting (XGBoost). Verification and comparison across single modalities, modality combinations, and machine learning methods show that the RF model built on the combined features of all three modalities achieves the best recognition performance. The results provide a theoretical basis for driver emotion recognition in intelligent vehicles and can help advance human-computer interaction (HCI) systems for intelligent vehicles and improve road traffic safety.
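The processing pipeline summarized in the abstract (denoise the raw signal, subtract a per-driver neutral-state baseline, extract statistical features, classify) can be sketched in a few lines. The following is a minimal, stdlib-only illustration, not the paper's implementation: the one-level Haar transform, the threshold value, the neutral-state baseline, and the synthetic nasal-tip temperature values are all assumptions made for demonstration.

```python
# Minimal sketch of a multimodal-style pipeline: wavelet denoising,
# baseline removal (debase), statistical features, and a KNN vote.
# All numeric values and signals below are hypothetical.
import math
import random
import statistics

def haar_denoise(signal, threshold=0.1):
    """One-level Haar wavelet transform with soft-thresholded detail coefficients."""
    n = len(signal) - len(signal) % 2  # use an even number of samples
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2) for i in range(0, n, 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2) for i in range(0, n, 2)]
    # Soft thresholding shrinks small (noise-dominated) detail coefficients to zero.
    detail = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
    return out

def debase(signal, neutral_baseline):
    """Subtract a driver's neutral-state level to reduce individual differences."""
    return [x - neutral_baseline for x in signal]

def features(signal):
    """Two statistical-domain features: mean and (population) standard deviation."""
    return (statistics.mean(signal), statistics.pstdev(signal))

def knn_predict(train, query, k=3):
    """Majority vote among the k nearest (feature_vector, label) training pairs."""
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

# Hypothetical nasal-tip temperature windows for two emotional states.
random.seed(0)
neutral = [[34.0 + random.gauss(0, 0.05) for _ in range(32)] for _ in range(5)]
anger = [[34.6 + random.gauss(0, 0.05) for _ in range(32)] for _ in range(5)]
baseline = 34.0
train = [(features(debase(haar_denoise(s), baseline)), "neutral") for s in neutral]
train += [(features(debase(haar_denoise(s), baseline)), "anger") for s in anger]
probe = [34.6 + random.gauss(0, 0.05) for _ in range(32)]
print(knn_predict(train, features(debase(haar_denoise(probe), baseline))))  # → anger
```

Thresholding in the wavelet domain suppresses high-frequency noise while preserving the slow-varying signal level, and subtracting each driver's neutral baseline makes feature values comparable across drivers, which is the motivation the abstract gives for the debase step. The paper additionally applies factor analysis for dimension reduction and compares RF, KNN, and XGBoost; this sketch shows only the KNN branch.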


Data Availability Statement

The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.

Acknowledgments

We thank all participants who took part in our study. This work was supported by the National Natural Science Foundation (Grant No. 51905224).
Author contributions: Jie Ni: conceptualization, methodology, data curation, formal analysis, resources, writing - original draft, writing - review and editing, project administration. Wanying Xie: conceptualization, data curation, formal analysis, writing - original draft, writing - review and editing. Jike Zhang: conceptualization, resources, writing - original draft. Yiping Liu: data curation, writing - original draft, writing - review and editing. Yugu Wan: writing - original draft, writing - review and editing. Huimin Ge: resources, writing - review and editing.


Information & Authors

Published In

Journal of Transportation Engineering, Part A: Systems
Volume 150, Issue 1, January 2024

History

Received: Nov 15, 2022
Accepted: Aug 22, 2023
Published online: Oct 27, 2023
Published in print: Jan 1, 2024
Discussion open until: Mar 27, 2024


Authors

Affiliations

Jie Ni
Associate Professor, Dept. of Automobile and Traffic Engineering, Jiangsu Univ., Zhenjiang, Jiangsu 212013, China (corresponding author). Email: [email protected]
Wanying Xie [email protected]
Graduate Student, Dept. of Automobile and Traffic Engineering, Jiangsu Univ., Zhenjiang, Jiangsu 212013, China. Email: [email protected]
Undergraduate Student, Dept. of Automobile and Traffic Engineering, Jiangsu Univ., Zhenjiang, Jiangsu 212013, China. Email: [email protected]
Graduate Student, Dept. of Transportation and Logistics Engineering, Wuhan Univ. of Technology, Wuhan, Hubei 430063, China. Email: [email protected]
Undergraduate Student, Dept. of Automobile and Traffic Engineering, Jiangsu Univ., Zhenjiang, Jiangsu 212013, China. Email: [email protected]
Associate Professor, Dept. of Automobile and Traffic Engineering, Jiangsu Univ., Zhenjiang, Jiangsu 212013, China. Email: [email protected]
