Optical Flow-Based Monocular Vision/INS Integrated Navigation for Mobile Robot Indoors

Abstract:

This paper proposes a new algorithm for optical flow-based monocular vision (MV)/inertial navigation system (INS) integrated navigation. In this scheme, a downward-looking camera captures image sequences, from which the velocity of the mobile robot is estimated using an optical flow algorithm, while the INS provides the yaw variation. To evaluate the performance of the proposed method, a real indoor test was conducted. The results show that the proposed method estimates velocity well and can be applied to the autonomous navigation of mobile robots when the Global Positioning System (GPS) and the code wheel (wheel encoder) are unavailable.
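The pipeline the abstract describes (optical flow from a downward-looking camera giving a body-frame velocity, INS yaw resolving it into the navigation frame) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, the pinhole-model scaling, and all parameter values are assumptions introduced here.

```python
import math

def flow_to_velocity(flow_px, focal_px, height_m, dt, yaw_rad):
    """Convert a mean optical-flow displacement (pixels per frame) from a
    downward-looking camera into a navigation-frame velocity estimate.

    Assumed pinhole model: ground displacement (m) ~= pixel displacement
    * camera height / focal length. The yaw angle (from the INS) rotates
    the body-frame velocity into the navigation frame.
    All names and parameters are illustrative, not from the paper.
    """
    # Body-frame velocity components (m/s) from the pinhole scaling.
    vx_b = flow_px[0] * height_m / (focal_px * dt)
    vy_b = flow_px[1] * height_m / (focal_px * dt)
    # 2-D rotation by the INS yaw angle into the navigation frame.
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return (c * vx_b - s * vy_b, s * vx_b + c * vy_b)

# Example: 10 px/frame along the camera x-axis, 500 px focal length,
# camera 0.2 m above the floor, 30 fps, robot heading rotated 90 degrees.
v = flow_to_velocity((10.0, 0.0), 500.0, 0.2, 1 / 30.0, math.pi / 2)
```

In practice the per-frame flow itself would come from a feature tracker such as pyramidal Lucas-Kanade over features selected by the Shi-Tomasi criterion, as in the methods the paper builds on.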

Pages:

375-378

Online since:

April 2014

Copyright:

© 2014 Trans Tech Publications Ltd. All Rights Reserved
