
Cognitive response navigation algorithm for mobile robots using biological antennas

  • Jiliang Jiang (a1), Dawei Tu (a1), Shuo Xu (a1) and Qijie Zhao (a1)

Summary

We present BioBug, a bionic cognitive-response navigation algorithm for mobile robots based on neuroethology principles. It combines a biological antenna model for environment perception with an improved Bug algorithm for motion planning and control. The antenna model delineates the sensing areas of interest and thus reduces the computational burden. BioBug then responds to the perceived environmental stimuli by generating the corresponding walking behavior. Simulations and experiments were carried out under different obstacle densities and boundary shapes, with comparisons against other algorithms. Compared with its competitors, BioBug achieves not only a shorter path length but also a shorter obstacle-escape time. The results demonstrate the practicality, environmental robustness, and obstacle-avoidance efficiency of the algorithm.
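For readers unfamiliar with Bug-style controllers, the sketch below illustrates the kind of behavior switching the summary describes: a fan-shaped "antenna" region on each side restricts sensing to the areas of interest, and the robot either steers toward the goal or follows an obstacle boundary depending on which antenna is stimulated. The function names, fan geometry, and steering constants here are illustrative assumptions and are not taken from the paper.

import math

# Minimal sketch of a Bug-style cognitive-response control step with a
# simplified two-antenna sensing model. All names, ranges, and thresholds
# are illustrative assumptions, not values from the paper.

ANTENNA_RANGE = 0.5                     # reach of each virtual antenna (m)
ANTENNA_FAN = math.radians(90)          # angular span covered per antenna

def antenna_hit(scan, side):
    """Return True if any reading inside the left or right antenna fan is
    closer than ANTENNA_RANGE. `scan` is a list of (angle, distance) pairs
    in the robot frame, with angle 0 straight ahead (radians)."""
    lo, hi = (0.0, ANTENNA_FAN) if side == "left" else (-ANTENNA_FAN, 0.0)
    return any(lo <= angle <= hi and dist < ANTENNA_RANGE
               for angle, dist in scan)

def step(pose, goal, scan):
    """One control step: head for the goal while both antennas are clear,
    otherwise switch to boundary-following and turn away from the
    stimulated side. Returns (behavior, steering command in rad)."""
    x, y, theta = pose
    goal_bearing = math.atan2(goal[1] - y, goal[0] - x) - theta
    left, right = antenna_hit(scan, "left"), antenna_hit(scan, "right")
    if not left and not right:
        return "go_to_goal", goal_bearing   # free space: steer toward goal
    if left and not right:
        return "follow_boundary", -0.5      # obstacle on the left: veer right
    if right and not left:
        return "follow_boundary", 0.5       # obstacle on the right: veer left
    return "follow_boundary", 0.8           # both antennas stimulated: turn hard

# Example: an obstacle sensed by the right antenna triggers a left turn.
behavior, steer = step((0.0, 0.0, 0.0), (5.0, 0.0),
                       [(-math.radians(20), 0.3)])

Restricting perception to the two antenna fans is what keeps the per-step cost low: only readings inside those regions are inspected, and the behavior switch itself is a constant-time decision.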

Corresponding author

*Corresponding author. E-mail: tdwshu@staff.shu.edu.cn

