American Journal of Engineering and Applied Sciences

Fragment-based Visual Tracking with Multiple Representations

Junqiu Wang and Yasushi Yagi

DOI: 10.3844/ajeassp.2016.187.194

Volume 9, Issue 2

Pages 187-194

Abstract

We present a fragment-based tracking algorithm that combines appearance information, characterized by a non-parametric distribution, with spatial information, described by a parametric representation. We segment an input object into several fragments based on appearance similarity and spatial distribution. Both cues are important for distinguishing different fragments, and we employ them to separate the object from its background: appearance is described by non-parametric kernel representations, while the spatial layout of the fragments is characterized by Gaussians. We integrate appearance and spatial information for target localization in images. The overall motion is estimated by the mean shift algorithm; because of mean-shift drifting, this estimate can deviate from the true position. We therefore refine the estimated position based on the foreground probabilities. The proposed tracker yields better target localization and better foreground probability images. Our experimental results demonstrate that integrating appearance and spatial information by combining parametric and non-parametric representations is effective for tracking targets in difficult sequences.

Copyright

© 2016 Junqiu Wang and Yasushi Yagi. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.