
A position estimation system for mobile robots using a monocular image of a 3-D landmark

Published online by Cambridge University Press:  09 March 2009

Kyoung C. Koh
Affiliation:
Department of Precision Engineering & Mechatronics, School of Mechanical Engineering, Korea Advanced Institute of Science and Technology, 373–1, Kusong-dong, Yusong-gu, Taejon, 305–701 (Korea)
Jae S. Kim
Affiliation:
Department of Precision Engineering & Mechatronics, School of Mechanical Engineering, Korea Advanced Institute of Science and Technology, 373–1, Kusong-dong, Yusong-gu, Taejon, 305–701 (Korea)
Hyung S. Cho
Affiliation:
Department of Precision Engineering & Mechatronics, School of Mechanical Engineering, Korea Advanced Institute of Science and Technology, 373–1, Kusong-dong, Yusong-gu, Taejon, 305–701 (Korea)

Summary

This paper presents an absolute position estimation system for a mobile robot moving on a flat surface. In this system, a 3-D landmark with four coplanar points and one non-coplanar point is utilized to improve the accuracy of position estimation and to guide the robot during navigation. Through theoretical analysis, we investigate the image sensitivity of the proposed 3-D landmark compared with a conventional 2-D landmark. In the camera calibration stage of the experiments, we employ a neural network as a computational tool. The neural network is trained on a set of learning data collected at various points around the mark so that the extrinsic and intrinsic parameters of the camera system can be resolved. The overall estimation algorithm, from mark identification to position determination, is implemented on a 32-bit personal computer with an image digitizer and an arithmetic accelerator. To demonstrate the effectiveness of the proposed 3-D landmark and the neural network-based calibration scheme, a series of navigation experiments was performed on a wheeled mobile robot (LCAR) in an indoor environment. The results demonstrate the feasibility of the proposed position estimation system for real-time mobile robot navigation.
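To make the estimation geometry concrete, the sketch below projects a hypothetical five-point 3-D landmark (four coplanar points plus one protruding, non-coplanar point) into a pinhole camera image and then fits a small multilayer perceptron that regresses the robot's planar pose from the resulting image coordinates. All landmark dimensions, camera parameters, and the direct image-to-pose mapping are illustrative assumptions standing in for the paper's scheme, which instead trains the network to resolve the camera's extrinsic and intrinsic parameters from learning data collected around the mark.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical 3-D landmark mounted on a wall in the plane y = 0 of the world
# frame (x right, y toward the wall, z up). Four coplanar points lie on the
# wall and one point protrudes toward the robot; all sizes are assumptions
# chosen for illustration, not the dimensions used in the paper.
MARK_HEIGHT = 0.5                        # assumed height of the mark centre [m]
LANDMARK_PTS = np.array([
    [-0.10, 0.00, MARK_HEIGHT - 0.10],
    [ 0.10, 0.00, MARK_HEIGHT - 0.10],
    [ 0.10, 0.00, MARK_HEIGHT + 0.10],
    [-0.10, 0.00, MARK_HEIGHT + 0.10],
    [ 0.00, -0.05, MARK_HEIGHT],         # the non-coplanar (protruding) point
])
FOCAL_PX = 600.0                         # assumed focal length [pixels]
IMAGE_CENTER = np.array([320.0, 240.0])  # assumed principal point [pixels]
CAM_HEIGHT = 0.30                        # assumed camera height above the floor [m]

def project_landmark(x, y, heading):
    """Pinhole projection of the five landmark points for a robot at planar
    pose (x, y, heading); camera axes: x right, y down, z along the heading."""
    c, s = np.cos(heading), np.sin(heading)
    R = np.array([[s, -c, 0.0],          # world -> camera rotation (yaw only)
                  [0.0, 0.0, -1.0],
                  [c,  s, 0.0]])
    cam_pos = np.array([x, y, CAM_HEIGHT])
    pts_cam = (LANDMARK_PTS - cam_pos) @ R.T
    uv = FOCAL_PX * pts_cam[:, :2] / pts_cam[:, 2:3] + IMAGE_CENTER
    return uv.ravel()                    # 10-dimensional image feature vector

# Collect synthetic "learning data" at poses scattered in front of the mark,
# then train a small multilayer perceptron to regress the pose from the image
# coordinates -- a simplified stand-in for the paper's calibration network.
rng = np.random.default_rng(0)
poses = np.column_stack([rng.uniform(-0.5, 0.5, 400),                 # x [m]
                         rng.uniform(-3.0, -1.5, 400),                # y [m]
                         np.pi / 2 + rng.uniform(-0.2, 0.2, 400)])    # heading [rad]
features = np.array([project_landmark(*p) for p in poses])

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
net.fit(features, poses)

test_pose = np.array([0.1, -2.0, np.pi / 2])
print("true pose:     ", test_pose)
print("estimated pose:", net.predict(project_landmark(*test_pose).reshape(1, -1))[0])
```

The MLPRegressor here is only a convenient substitute for the backpropagation network described in the paper; the point of the sketch is that image coordinates of the five landmark points carry enough information to recover the planar pose once the camera mapping has been learned.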

Type
Article
Copyright
Copyright © Cambridge University Press 1994

