
A methodology to detect pilot perception of warning information by eye movement data and deep residual shrinkage networks

Published online by Cambridge University Press:  03 January 2023

C.-Q. Yan
Affiliation:
College of Civil Aviation, Nanjing University of Aeronautics & Astronautics, 211106 Nanjing, China
Y.-C. Sun*
Affiliation:
College of Civil Aviation, Nanjing University of Aeronautics & Astronautics, 211106 Nanjing, China
X. Zhang
Affiliation:
College of Civil Aviation, Nanjing University of Aeronautics & Astronautics, 211106 Nanjing, China
H.-Y. Mao
Affiliation:
China Aeronautical Radio Electronics Research Institute, 200241 Shanghai, China
J.-Y. Jiang
Affiliation:
College of Civil Aviation, Nanjing University of Aeronautics & Astronautics, 211106 Nanjing, China
*Corresponding author. Email: sunyc@nuaa.edu.cn

Abstract

This paper studied the use of eye movement data to form criteria for judging whether pilots perceive emergency information such as cockpit warnings. In the experiment, 12 subjects randomly encountered different warning information while flying a simulated helicopter, and their eye movement data were collected synchronously. Firstly, the importance of each eye movement feature was calculated by ANOVA (analysis of variance). The features were then ranked by importance and by the Euclidean distance between them, yielding warning information samples with different subsets of eye movement features. Secondly, residual shrinkage modules were added to a CNN (convolutional neural network) to construct a DRSN (deep residual shrinkage network) model. Finally, the processed warning information samples were used to train and test the DRSN model. To verify the superiority of this method, the DRSN model was compared with three machine learning models: SVM (support vector machine), RF (random forest) and BPNN (backpropagation neural network). Among the four models, the DRSN performed best. When all eye movement features were selected, it detected pilot perception of warning information with an average accuracy of 90.4%, and its highest detection accuracy reached 96.4%. The experiments showed that the DRSN model has advantages in detecting pilot perception of warning information.
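The distinguishing component of a DRSN, relative to a plain residual CNN, is a learned soft-thresholding step that shrinks small, noise-like activations toward zero. Below is a minimal sketch of one channel-wise residual shrinkage block in PyTorch, assuming 1-D sequences of eye movement features as input; the layer sizes, kernel widths and the choice of the channel-wise thresholding variant are illustrative assumptions, not the authors' published configuration.

```python
# Sketch of a residual shrinkage block (channel-wise variant), assuming
# 1-D eye-movement feature sequences; hyperparameters are illustrative.
import torch
import torch.nn as nn


class ResidualShrinkageBlock1d(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm1d(channels),
        )
        # Small subnetwork that learns a per-channel soft threshold
        # from the average absolute activation of each channel.
        self.threshold_net = nn.Sequential(
            nn.Linear(channels, channels),
            nn.ReLU(),
            nn.Linear(channels, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        r = self.body(x)                                # (batch, channels, length)
        abs_mean = r.abs().mean(dim=2)                  # (batch, channels)
        tau = abs_mean * self.threshold_net(abs_mean)   # learned thresholds
        tau = tau.unsqueeze(2)                          # broadcast over length
        # Soft thresholding: shrink small (noise-like) responses to zero.
        r = torch.sign(r) * torch.clamp(r.abs() - tau, min=0.0)
        return torch.relu(r + x)                        # residual connection
```

Stacking several such blocks on top of an initial convolutional layer, followed by global pooling and a fully connected classifier, gives a DRSN-style detector; feeding it the ANOVA-selected eye movement features would correspond to the pipeline the abstract describes.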

Type
Research Article
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of Royal Aeronautical Society

