
Multi-cameras visual servoing for dual-arm coordinated manipulation

Published online by Cambridge University Press: 12 January 2017

Jiadi Qu, Fuhai Zhang*, Yili Fu* and Shuxiang Guo

Affiliation: State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, P. R. China. E-mails: qujiadi@hotmail.com, 18745007416@163.com

*Corresponding author. E-mail: zfhhit@hit.edu.cn

Summary

Although image-based visual servoing (IBVS) performs well in many dual-arm manipulation applications, it has serious limitations when dealing with large position and orientation uncertainty: the object features may leave the camera's field of view, and the dual-arm robot may fail to converge to its goal configuration. In this paper, a novel vision-based control strategy is presented to overcome these limitations. A visual path-planning method for the dual-arm end-effector features is proposed to regulate the large initial poses to pre-alignment poses. Visual constraints between the positions and orientations of the two objects are then established, and sequenced subtasks are executed with a multi-task IBVS method to achieve pose alignment of the two objects. The proposed strategy has been implemented on a MOTOMAN robot for plug–socket and cup–lid alignment tasks. Results show that the plug and socket, starting from large initial pose errors of 145.4 mm and 43.8° (average errors over three axes), are successfully aligned within the allowed pose alignment errors of 3.1 mm and 1.1°, and the cup and lid, starting from initial pose errors of 131.7 mm and 20.4°, are aligned within −2.7 mm and −0.8°.
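For orientation, the multi-task scheme summarized above builds on the classical IBVS velocity law v = −λL⁺(s − s*) from the visual-servoing tutorial literature. The sketch below illustrates that underlying law for normalized image-point features with estimated depths. It is a minimal illustration under stated assumptions, not the authors' multi-task or path-planning implementation; the function names, gain value, and depth estimates are illustrative.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """2x6 interaction (image Jacobian) matrix for one normalized
    image point (x, y) at estimated depth Z."""
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,      -(1.0 + x**2),  y],
        [0.0,      -1.0 / Z, y / Z, 1.0 + y**2, -x * y,        -x],
    ])

def ibvs_velocity(features, goals, depths, lam=0.5):
    """Classical IBVS law v = -lambda * L^+ (s - s*): stacks the
    per-point interaction matrices and returns a 6-DOF camera
    velocity twist (vx, vy, vz, wx, wy, wz)."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(goals)).ravel()
    return -lam * np.linalg.pinv(L) @ error

# Example: drive four tracked point features toward their goals.
features = [(0.10, 0.20), (-0.10, 0.15), (0.05, -0.10), (-0.20, -0.05)]
goals    = [(0.00, 0.10), (-0.15, 0.10), (0.10, -0.15), (-0.25,  0.00)]
depths   = [1.0, 1.0, 1.2, 1.1]
print(ibvs_velocity(features, goals, depths))
```

Because this law regulates only the image-space error, features can still exit the field of view under large initial pose errors, which is exactly the failure mode the paper's path-planning and pre-alignment stage is designed to prevent.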

Type: Articles

Copyright © Cambridge University Press 2017
