Obstacle Avoidance through Gesture Recognition: Business Advancement Potential in Robot Navigation Socio-Technology

  • Xuan Liu (a1), Kashif Nazar Khan (a2), Qamar Farooq (a1) (a3), Yunhong Hao (a1) and Muhammad Shoaib Arshad (a2)...


In the modern age, a robot works alongside humans and is controlled so that its movements do not hinder human activities. This capability relies on gesture capture and gesture recognition. This article describes developments in algorithms for obstacle avoidance in robot navigation, which can open new horizons for business advancement. To this end, our study focuses on gesture recognition and its socio-technological implications. A review of the literature reveals that robot movement can be made more efficient by introducing gesture-based collision avoidance techniques. Experimental results illustrate a high level of robustness and usability of the gesture recognition (GR) system, with an overall error rate of almost 10%. In our subjective judgment, the GR system is well suited to instructing a mobile service robot to change its path at a human's instruction.
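The core idea of the abstract, a mobile service robot changing its path on a human's instruction, can be sketched as a control step that maps a recognized gesture label to a navigation command. The gesture labels, the `turn_step` parameter, and the `apply_gesture` helper below are illustrative assumptions for this sketch, not the paper's actual GR system:

```python
from dataclasses import dataclass

# Hypothetical gesture labels a recognition system might emit.
STOP, TURN_LEFT, TURN_RIGHT, NONE = "stop", "turn_left", "turn_right", "none"

@dataclass
class RobotState:
    heading_deg: float = 0.0  # current heading in degrees
    moving: bool = True       # whether the robot is currently driving

def apply_gesture(state: RobotState, gesture: str,
                  turn_step: float = 30.0) -> RobotState:
    """Adjust the robot's path according to a recognized human gesture.

    An unrecognized gesture (or NONE) leaves the current path unchanged,
    which is a conservative choice given a nonzero recognition error rate.
    """
    if gesture == STOP:
        state.moving = False
    elif gesture == TURN_LEFT:
        state.heading_deg = (state.heading_deg + turn_step) % 360
    elif gesture == TURN_RIGHT:
        state.heading_deg = (state.heading_deg - turn_step) % 360
    return state

if __name__ == "__main__":
    state = RobotState()
    for g in [TURN_LEFT, TURN_LEFT, STOP]:
        state = apply_gesture(state, g)
    print(state.heading_deg, state.moving)  # 60.0 False
```

In a real system the gesture label would come from a classifier over camera or depth-sensor data; this sketch only shows how such a label could be turned into a collision-avoiding path change.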


*Corresponding author. E-mail:





