ACT-R-typed human–robot collaboration mechanism for elderly and disabled assistance

  • Shuo Xu, Dawei Tu, Yongyi He, Shili Tan and Minglun Fang

Summary

This work proposes an innovative human–robot collaboration (HRC) mechanism for mobile service robots assisting the elderly and disabled. Previous studies on HRC mechanisms have usually focused on integrating the decision-making intelligence of human beings, exercised through qualitative judgment, with the reasoning intelligence of robots, exercised through quantitative calculation. In contrast, the novelties of the proposed methodology include (1) constructing an HRC framework modeled on the Adaptive Control of Thought – Rational (ACT-R) human cognitive architecture; (2) establishing semantic webs of cognitive reasoning through human–robot interaction (HRI) and HRC to plan and implement complex tasks; and (3) realizing human–robot intelligence fusion through the mutual encouragement, connection, and integration of the human, robot, perception, HRI, and HRC modules within the ACT-R architecture. Technical feasibility is validated by selected experiments within a "pouring" scenario. Although this study is oriented toward mobile service robots, the modularized design of hardware and software makes the approach readily extensible to other types of service robots, such as smart rehabilitation beds, wheelchairs, and cleaning equipment.
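To make the modular organization concrete, below is a minimal Python sketch, not the authors' implementation, of how an ACT-R-styled architecture can couple the human, robot, perception, HRI, and HRC modules through buffers and a central production matcher. All class, chunk, and production names here are hypothetical illustrations under that assumption.

    # Minimal sketch (hypothetical names, not the paper's code) of an
    # ACT-R-style organization: each module exposes one buffer, and a
    # central procedural matcher fires the first production whose
    # condition matches the global buffer state.

    from dataclasses import dataclass
    from typing import Callable, Dict, List, Optional


    @dataclass
    class Buffer:
        """Holds at most one chunk at a time, as in ACT-R."""
        chunk: Optional[dict] = None


    @dataclass
    class Production:
        """An if-then rule over the global buffer state."""
        name: str
        condition: Callable[[Dict[str, Buffer]], bool]
        action: Callable[[Dict[str, Buffer]], None]


    class ActRStyleArchitecture:
        def __init__(self) -> None:
            # One buffer per module named in the summary.
            self.buffers: Dict[str, Buffer] = {
                m: Buffer() for m in ("human", "robot", "perception", "hri", "hrc")
            }
            self.productions: List[Production] = []

        def step(self) -> Optional[str]:
            """Fire the first matching production; return its name, or None."""
            for p in self.productions:
                if p.condition(self.buffers):
                    p.action(self.buffers)
                    return p.name
            return None


    # Toy stand-in for the "pouring" scenario: perception reports an empty
    # cup, and a single production plans the pour and commands the robot
    # module. The paper's semantic-web reasoning is far richer than one rule.
    arch = ActRStyleArchitecture()
    arch.buffers["perception"].chunk = {"object": "cup", "state": "empty"}


    def plan_pouring(buffers: Dict[str, Buffer]) -> None:
        buffers["hrc"].chunk = {"task": "pouring", "status": "planned"}
        buffers["robot"].chunk = {"command": "pour", "target": "cup"}


    arch.productions.append(
        Production(
            name="plan-pouring",
            condition=lambda b: (b["perception"].chunk or {}).get("object") == "cup",
            action=plan_pouring,
        )
    )

    print(arch.step())                  # -> plan-pouring
    print(arch.buffers["robot"].chunk)  # -> {'command': 'pour', 'target': 'cup'}

In a full system of the kind the summary describes, the condition side would match chunks produced by perception and HRI channels such as speech, gesture, or gaze, and the action side would dispatch motion primitives to the mobile platform.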


Corresponding author

*Corresponding author. Email: sxu@shu.edu.cn

