Optical Flow Vectors Thresholding in Assisting Heading Direction Estimation

Abstract:

This paper presents a technique for estimating the heading direction of a moving vision system, such as a mobile robot. The heading direction is represented by an estimated angle generated by an optical flow vector thresholding technique (OFVTT). Optical flow fields generated by the Horn-Schunck and Lucas-Kanade methods are used to compute the threshold value. The performance of the proposed technique was evaluated using the percentage root mean square error (RMSE). Based on our experimental results, the combination of the Horn-Schunck method with OFVTT shows the best performance in assisting the heading direction estimation of a moving vision system.
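The preview does not spell out the OFVTT rule, so the following is only a plausible sketch of the idea the abstract describes: threshold a dense optical flow field to discard weak vectors, take the direction of the surviving dominant flow as the heading angle, and score the estimate with a percentage RMSE. The threshold rule (keep vectors whose magnitude exceeds k times the mean magnitude), the heading definition (angle of the summed retained vectors), and both function names are assumptions, not the paper's method.

```python
import numpy as np

def estimate_heading(flow, k=1.0):
    """Estimate a heading angle (degrees) from a dense optical flow field.

    flow : (H, W, 2) array of per-pixel (u, v) flow vectors, e.g. from a
    Horn-Schunck or Lucas-Kanade computation.
    NOTE: the threshold rule below (keep vectors with magnitude above
    k * mean magnitude) is a hypothetical stand-in for the paper's OFVTT.
    """
    u = flow[..., 0].ravel()
    v = flow[..., 1].ravel()
    mag = np.hypot(u, v)
    thresh = k * mag.mean()      # assumed threshold value
    keep = mag > thresh          # discard weak / noisy vectors
    if not keep.any():
        return 0.0
    # Heading: direction of the resultant of the dominant flow vectors.
    return float(np.degrees(np.arctan2(v[keep].sum(), u[keep].sum())))

def rmse_percent(estimated, reference):
    """Percentage RMSE between estimated and reference heading angles."""
    est = np.asarray(estimated, dtype=float)
    ref = np.asarray(reference, dtype=float)
    rmse = np.sqrt(np.mean((est - ref) ** 2))
    return float(100.0 * rmse / (np.abs(ref).mean() + 1e-12))
```

For example, a synthetic field in which most vectors point along (1, 1) with a few near-zero outliers yields a heading estimate of 45 degrees once the outliers fall below the threshold.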

Info:

Pages: 623-628

Online since: September 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
