A Two-Stage Algorithm of High Resolution Image Alignment for Mobile Applications

DOI: 10.4236/jcc.2016.44004

ABSTRACT

Global motion estimation (GME) algorithms are widely applied in computer vision and video processing. In previous works, image resolutions are usually kept low to meet real-time requirements (e.g. video stabilization). However, in some mobile device applications (e.g. image-sequence panoramic stitching), high resolution is necessary to obtain a panoramic image of satisfactory quality, and the computational cost then becomes too expensive for the low power consumption requirement of mobile devices. The full search algorithm can find the global minimum but at an extremely high computational cost, while typical fast algorithms may suffer from the local-minimum problem. This paper proposes a fast algorithm to deal with 2560 × 1920 high-resolution (HR) image sequences. The proposed method estimates the motion vector by a two-level coarse-to-fine scheme that exploits only sparse reference blocks (25 blocks in this paper) in each level to determine the global motion vector, so the computational cost is significantly reduced. To increase the effective search range and robustness, the predictive motion vector (PMV) technique is used in this work. Comparisons of computational complexity show that the proposed algorithm requires fewer addition operations than the typical Three-Step Search (TSS) algorithm for estimating the global motion of HR images, without the local-minimum problem. Quantitative evaluations show that our method is comparable to the full search algorithm (FSA), which is regarded as the golden baseline.
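As an illustration of the scheme described in the abstract, the following Python sketch implements a simplified two-level coarse-to-fine global motion estimation that matches a sparse grid of reference blocks and seeds each level with a predicted motion vector. The block size, search radius, SAD cost, median aggregation of block votes, and all function names are illustrative assumptions for this sketch, not details taken from the paper.

import numpy as np


def sad(block, candidate):
    # Sum of absolute differences between two equally sized blocks.
    return np.abs(block.astype(np.int32) - candidate.astype(np.int32)).sum()


def match_block(ref_block, target, top, left, seed, radius):
    # Full search of one reference block around a predicted center (seed) in the target frame.
    h, w = ref_block.shape
    best_cost, best_mv = float("inf"), seed
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y = top + seed[0] + dy
            x = left + seed[1] + dx
            if 0 <= y <= target.shape[0] - h and 0 <= x <= target.shape[1] - w:
                cost = sad(ref_block, target[y:y + h, x:x + w])
                if cost < best_cost:
                    best_cost, best_mv = cost, (seed[0] + dy, seed[1] + dx)
    return best_mv


def estimate_global_motion(ref, target, levels=2, grid=5, block=16, radius=4, pmv=(0, 0)):
    # Two-level coarse-to-fine GME over a sparse grid x grid pattern of reference blocks
    # (grid=5 gives the 25 blocks per level mentioned in the abstract).
    # The external PMV seeds the coarse level; the coarse result seeds the fine level.
    mv = pmv
    for level in range(levels):
        f = 2 ** (levels - 1 - level)          # downsampling factor of this level
        r, t = ref[::f, ::f], target[::f, ::f]
        seed = (mv[0] // f, mv[1] // f)
        ys = np.linspace(0, r.shape[0] - block, grid, dtype=int)
        xs = np.linspace(0, r.shape[1] - block, grid, dtype=int)
        votes = [match_block(r[y:y + block, x:x + block], t, y, x, seed, radius)
                 for y in ys for x in xs]
        dy, dx = np.median(np.array(votes), axis=0)   # robust vote over the sparse blocks
        mv = (int(dy) * f, int(dx) * f)               # map back to full resolution
    return mv

Given two grayscale frames of the same size, estimate_global_motion(prev_gray, curr_gray) would return an estimated (dy, dx) global displacement; the median over block votes is one simple way to suppress outlier blocks dominated by local motion.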

Share and Cite:

Huang, R., Dung, L. and Hong, T. (2016) A Two-Stage Algorithm of High Resolution Image Alignment for Mobile Applications. Journal of Computer and Communications, 4, 36-51. doi: 10.4236/jcc.2016.44004.

Copyright © 2024 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.