Appearance-Based Gaze Estimation for Driver Monitoring

Soodeh Nikan, Devesh Upadhyay
Proceedings of The 1st Gaze Meets ML workshop, PMLR 210:127-139, 2023.

Abstract

Driver inattention is a leading cause of road accidents through its impact on reaction time in the face of incidents. In the case of Level-3 (L3) vehicles, inattention adversely impacts the quality of driver take-over and therefore the safe performance of L3 vehicles. There is a high correlation between a driver’s visual attention and eye movement. Gaze angle is an excellent surrogate for assessing driver attention zones, in both cabin-interior and on-road scenarios. We propose appearance-based gaze estimation approaches using convolutional neural networks (CNNs) to estimate gaze angle directly from eye images and also from eye landmark coordinates. The goal is to improve learning by utilizing synthetic data with more accurate annotations. Performance analysis shows that our proposed landmark-based model, trained on synthetic data, can predict gaze angle on real data with reasonable angular error. In addition, we discuss that evaluation metrics are application specific and that there is a crucial need for an assessment metric more reliable than the common mean angular error for measuring the driver’s gaze direction in L3 autonomy, so that a control takeover request can be issued at the proper time, corresponding to the driver’s attention focus, and ambiguities are avoided.
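
For context on the evaluation metric discussed above, the following is a minimal Python/NumPy sketch (not taken from the paper) of how mean angular error between predicted and ground-truth gaze angles is commonly computed; the (pitch, yaw) ordering and the axis convention in angles_to_vector are assumptions.

    import numpy as np

    def angles_to_vector(pitch, yaw):
        # Convert gaze (pitch, yaw) in radians to a 3D unit direction vector.
        # Axis convention is an assumption; the paper does not specify one.
        x = -np.cos(pitch) * np.sin(yaw)
        y = -np.sin(pitch)
        z = -np.cos(pitch) * np.cos(yaw)
        return np.stack([x, y, z], axis=-1)

    def mean_angular_error(pred_angles, true_angles):
        # Mean angular error (degrees) between predicted and ground-truth gaze.
        # Both inputs are arrays of shape (N, 2) holding (pitch, yaw) in radians.
        p = angles_to_vector(pred_angles[:, 0], pred_angles[:, 1])
        t = angles_to_vector(true_angles[:, 0], true_angles[:, 1])
        cos_sim = np.clip(np.sum(p * t, axis=-1), -1.0, 1.0)  # dot product of unit vectors
        return np.degrees(np.arccos(cos_sim)).mean()

    # Example: two predictions a few degrees off their ground truth
    pred = np.radians([[5.0, 10.0], [0.0, -20.0]])
    true = np.radians([[4.0, 12.0], [1.0, -18.0]])
    print(f"Mean angular error: {mean_angular_error(pred, true):.2f} deg")

Averaging per-sample angular errors in this way is what "mean angular error" usually refers to in appearance-based gaze estimation; the paper argues that this aggregate alone may be insufficient for deciding when to issue an L3 takeover request.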

Cite this Paper


BibTeX
@InProceedings{pmlr-v210-nikan23a,
  title     = {Appearance-Based Gaze Estimation for Driver Monitoring},
  author    = {Nikan, Soodeh and Upadhyay, Devesh},
  booktitle = {Proceedings of The 1st Gaze Meets ML workshop},
  pages     = {127--139},
  year      = {2023},
  editor    = {Lourentzou, Ismini and Wu, Joy and Kashyap, Satyananda and Karargyris, Alexandros and Celi, Leo Anthony and Kawas, Ban and Talathi, Sachin},
  volume    = {210},
  series    = {Proceedings of Machine Learning Research},
  month     = {03 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v210/nikan23a/nikan23a.pdf},
  url       = {https://proceedings.mlr.press/v210/nikan23a.html},
  abstract  = {Driver inattention is a leading cause of road accidents through its impact on reaction time in the face of incidents. In the case of Level-3 (L3) vehicles, inattention adversely impacts the quality of driver take-over and therefore the safe performance of L3 vehicles. There is a high correlation between a driver’s visual attention and eye movement. Gaze angle is an excellent surrogate for assessing driver attention zones, in both cabin-interior and on-road scenarios. We propose appearance-based gaze estimation approaches using convolutional neural networks (CNNs) to estimate gaze angle directly from eye images and also from eye landmark coordinates. The goal is to improve learning by utilizing synthetic data with more accurate annotations. Performance analysis shows that our proposed landmark-based model, trained on synthetic data, can predict gaze angle on real data with reasonable angular error. In addition, we discuss that evaluation metrics are application specific and that there is a crucial need for an assessment metric more reliable than the common mean angular error for measuring the driver’s gaze direction in L3 autonomy, so that a control takeover request can be issued at the proper time, corresponding to the driver’s attention focus, and ambiguities are avoided.}
}
Endnote
%0 Conference Paper
%T Appearance-Based Gaze Estimation for Driver Monitoring
%A Soodeh Nikan
%A Devesh Upadhyay
%B Proceedings of The 1st Gaze Meets ML workshop
%C Proceedings of Machine Learning Research
%D 2023
%E Ismini Lourentzou
%E Joy Wu
%E Satyananda Kashyap
%E Alexandros Karargyris
%E Leo Anthony Celi
%E Ban Kawas
%E Sachin Talathi
%F pmlr-v210-nikan23a
%I PMLR
%P 127--139
%U https://proceedings.mlr.press/v210/nikan23a.html
%V 210
%X Driver inattention is a leading cause of road accidents through its impact on reaction time in the face of incidents. In the case of Level-3 (L3) vehicles, inattention adversely impacts the quality of driver take-over and therefore the safe performance of L3 vehicles. There is a high correlation between a driver’s visual attention and eye movement. Gaze angle is an excellent surrogate for assessing driver attention zones, in both cabin-interior and on-road scenarios. We propose appearance-based gaze estimation approaches using convolutional neural networks (CNNs) to estimate gaze angle directly from eye images and also from eye landmark coordinates. The goal is to improve learning by utilizing synthetic data with more accurate annotations. Performance analysis shows that our proposed landmark-based model, trained on synthetic data, can predict gaze angle on real data with reasonable angular error. In addition, we discuss that evaluation metrics are application specific and that there is a crucial need for an assessment metric more reliable than the common mean angular error for measuring the driver’s gaze direction in L3 autonomy, so that a control takeover request can be issued at the proper time, corresponding to the driver’s attention focus, and ambiguities are avoided.
APA
Nikan, S. & Upadhyay, D. (2023). Appearance-Based Gaze Estimation for Driver Monitoring. Proceedings of The 1st Gaze Meets ML workshop, in Proceedings of Machine Learning Research 210:127-139. Available from https://proceedings.mlr.press/v210/nikan23a.html.