Development of a sensor fusion method for crop row tracking operations

B. Benet, R. Lenain and V. Rousseau

Abstract

A sensor fusion method was developed to track crop rows for various crops and vegetation levels. The application fuses a laser sensor, an inertial measurement unit and a color camera to obtain, in real time, the set of points corresponding to the crop rows while eliminating environmental noise such as grass or leaves. After a method such as the Hough transform or a least-squares (LS) fit is applied to obtain the geometry of the crop line, automatic control is used to perform the row tracking operation at a desired lateral deviation, taking into account the robot's angular deviation and the temporal dynamics so that the task is performed accurately and without oscillation. The results showed the robustness of the fusion method in achieving stable autonomous navigation for crop row tracking, particularly in vineyards, despite many perturbations such as bumps, holes and mud, at speeds between 1 and 2 m s⁻¹. The mean lateral error between the desired and obtained trajectories varied between 0.10 and 0.40 m, depending on speed and perturbations.
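
The abstract names the Hough and least-squares (LS) techniques for extracting the crop line but gives no implementation detail. The sketch below is a minimal illustration, not the authors' code: it fits a row line to the fused 2-D point set by total least squares and derives the lateral and angular deviations a row-tracking controller would consume. The function name and the frame and sign conventions are assumptions.

```python
import numpy as np

def fit_crop_line_ls(points):
    """Fit a crop-row line to fused 2-D points by total least squares.

    `points` is an (N, 2) array of (x, y) positions in the robot frame,
    e.g. laser returns kept after the camera/IMU filtering stage has
    rejected grass and leaves. Returns (lateral_dev, angular_dev):
    the signed distance (m) from the robot origin to the line and the
    heading (rad) of the row relative to the robot's forward x axis.
    Frame and sign conventions are assumptions, not from the paper.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # Principal direction of the centred cloud; unlike fitting y = ax + b,
    # this stays well conditioned for rows nearly parallel to either axis.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    if direction[0] < 0:                      # make the row point "forward"
        direction = -direction
    normal = np.array([-direction[1], direction[0]])  # left-pointing normal
    lateral_dev = float(normal @ centroid)    # > 0: row line lies to the left
    angular_dev = float(np.arctan2(direction[1], direction[0]))
    return lateral_dev, angular_dev
```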
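
On the control side, the abstract only states that the controller regulates a desired lateral deviation while accounting for the angular deviation. One plausible sketch is a Stanley-style steering law, shown below; this is a stand-in for the paper's unspecified control scheme, with placeholder gains and a bicycle-model assumption.

```python
import math

def steering_command(lateral_dev, angular_dev, desired_lateral, speed,
                     k_gain=0.8, max_steer=0.5):
    """Stanley-style steering law for row tracking (illustrative sketch).

    `lateral_dev` and `angular_dev` come from the line fit above;
    `desired_lateral` is the operator-chosen offset from the row.
    Returns a front-wheel steering angle (rad), saturated to +/- max_steer.
    Gains, limits and the bicycle-model assumption are placeholders,
    not values or choices taken from the paper.
    """
    cross_track = lateral_dev - desired_lateral
    # Heading term aligns the robot with the row; the cross-track term
    # steers toward the desired offset and is softened as speed grows,
    # which keeps the response free of oscillation at 1-2 m/s.
    steer = angular_dev + math.atan2(k_gain * cross_track, max(speed, 0.1))
    return max(-max_steer, min(max_steer, steer))
```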

