
Interpretable long-term gait trajectory prediction based on Interpretable-Concatenation former

Published online by Cambridge University Press:  20 November 2023

Jie Yin
Affiliation:
Department of Automation, Tsinghua University, Beijing, P.R. China
Meng Chen
Affiliation:
Aerospace System Engineering Shanghai, ASES, Shanghai, P.R. China
Chongfeng Zhang
Affiliation:
Shanghai Academy of Spaceflight Technology, SAST, Shanghai, P.R. China
Tao Xue
Affiliation:
Department of Automation, Tsinghua University, Beijing, P.R. China
Ming Zhang
Affiliation:
College of Engineering and Physical Sciences, Aston University, Birmingham, UK
Tao Zhang*
Affiliation:
Department of Automation, Tsinghua University, Beijing, P.R. China
*Corresponding author: Tao Zhang; Email: taozhang@mail.tsinghua.edu.cn

Abstract

Human gait trajectory prediction is a long-standing research topic in human–machine interaction. However, current gait trajectory prediction technology has two shortcomings. First, existing neural network models predict only a few dozen future time frames of the gait trajectory. Second, these models are uninterpretable. We propose the Interpretable-Concatenation former (IC-former), a model that predicts long-term gait trajectories and explains its predictions by quantifying the importance of data at different positions in the input sequence. Experiments show that the proposed IC-former not only achieves a breakthrough in prediction accuracy but also successfully identifies the data on which its predictions are based.
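The abstract does not detail how the IC-former quantifies per-position importance; one common approach in Transformer-style forecasters, shown here purely as an illustrative sketch (not the authors' method), is to average attention weights over heads and query steps to obtain one importance score per input time step:

```python
import numpy as np

def position_importance(attn):
    """Illustrative only: attn has shape (heads, queries, keys); returns one
    normalized importance score per input (key) position by averaging the
    attention weights over heads and query positions."""
    attn = np.asarray(attn, dtype=float)
    scores = attn.mean(axis=(0, 1))   # average over heads and queries -> (keys,)
    return scores / scores.sum()      # normalize to a probability distribution

# Toy example: 2 heads, 3 query steps, 4 input positions.
rng = np.random.default_rng(0)
raw = rng.random((2, 3, 4))
attn = raw / raw.sum(axis=-1, keepdims=True)  # each row sums to 1, like softmax
imp = position_importance(attn)
print(imp.shape)  # (4,)
```

The resulting vector can be read as "how much each input time frame contributed to the forecast", which is the kind of quantity the abstract's interpretability claim refers to.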

Type
Research Article
Copyright
© The Author(s), 2023. Published by Cambridge University Press

