Image Registration of Blown and Swayed Trees

Abstract:

This paper addresses the problem of registering images of trees blown and swayed by wind. The swaying leaves and small branches contain many details and feature points, but matching on these points can lead to erroneous registration. In this work, we first remove the fine detail of leaves and thin branches using morphological image processing, then extract the relatively invariant crotch features of the trees from the thinned images, and finally refine the matching points with the Levenberg-Marquardt (L-M) algorithm. Results show that our method is insensitive to the ordering, rotation, and scale of the input images, so it can be used for image stitching and for image and video retrieval.
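The detail-removal step described above can be illustrated with a minimal sketch, assuming a hypothetical binary tree silhouette and using SciPy's morphological operators; this is not the authors' implementation, and the thinning and L-M refinement steps are omitted:

```python
import numpy as np
from scipy import ndimage as ndi

# Hypothetical binary silhouette: a thick trunk with one thin "twig".
img = np.zeros((20, 20), dtype=bool)
img[2:18, 8:12] = True   # trunk, 4 px wide
img[5, 12:17] = True     # thin twig, 1 px wide

# Morphological opening (erosion then dilation) with a 3x3 structuring
# element removes structures thinner than the element -- leaves and
# thin branches -- while preserving the thick trunk and main crotches.
opened = ndi.binary_opening(img, structure=np.ones((3, 3), dtype=bool))

print(opened[5, 12:17].any())  # twig removed -> False
print(opened[10, 9])           # trunk preserved -> True
```

After opening, a thinning (skeletonization) pass would reduce the remaining trunk and major branches to one-pixel-wide curves from which crotch points can be located as skeleton junctions.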

Info:

Periodical:

Key Engineering Materials (Volumes 474-476)

Pages:

2183-2188

Online since:

April 2011

Copyright:

© 2011 Trans Tech Publications Ltd. All Rights Reserved

Citation:

[1] Harris, C. Geometry from visual motion. In: Blake, A. and Yuille, A. (eds.), Active Vision. MIT Press (1992), pp.263-284.

[2] Smith, S.M. and Brady, J.M. SUSAN - A new approach to low level image processing. International Journal of Computer Vision, Vol. 23(1) (1997), pp.45-78.

[3] Lowe, D.G. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, Vol. 60(2) (2004), pp.91-110.

[4] Gonzalez, R.C. and Woods, R.E. Digital Image Processing, Second Edition. Beijing: Publishing House of Electronics Industry (2008), pp.519-560.

[5] Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Systems, Man, and Cybernetics, Vol. 9(1) (1979), pp.62-66.

DOI: 10.1109/tsmc.1979.4310076


[6] Brown, M. and Lowe, D.G. Automatic panoramic image stitching using invariant features. International Journal of Computer Vision, Vol. 74(1) (2007), pp.59-73.

DOI: 10.1007/s11263-006-0002-3
