Journal of Computer Science

SHAPE RETRIEVAL THROUGH MAHALANOBIS DISTANCE WITH SHORTEST AUGMENTING PATH ALGORITHM

S. Muruganathan, N. Devarajan, D. Chitra and T. Manigandan

DOI : 10.3844/jcssp.2014.552.562


Volume 10, Issue 4

Pages 552-562

Abstract

Shape matching and object recognition play a vital role in computer vision. Shape matching is difficult for real-world images, such as those in the MPEG database, because such images contain both internal and external contours. A Mahalanobis distance based shape context approach is proposed to measure similarity between shapes and exploit it for shape retrieval. Shape retrieval identifies the shapes in a database that are relevant to a query image: the query image is matched against the reference images, yielding a dissimilarity between the shapes, and this dissimilarity measure, the distance between the two images, is used to identify the relevant images in the database. Shape matching has three major steps: finding correspondences, measuring distance and applying an aligning transformation. Finding correspondences means finding the best matching points between the query image and the reference image; the correspondence problem is solved with shape context and the shortest augmenting path algorithm. Measuring distance finds the distance between corresponding points; in this study, the Mahalanobis distance is used to find the distance between the images. The aligning transformation aligns the shapes in order to achieve the best matching points. Object recognition is achieved with the k-nearest neighbor algorithm. The proposed method is simple, invariant to noise and gives a better error rate than the existing methods.
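The pipeline described above — a Mahalanobis cost between point descriptors followed by a minimum-cost one-to-one correspondence — can be sketched as follows. This is an illustrative toy, not the authors' code: it uses raw 2-D points in place of shape context histograms, takes the inverse covariance matrix as a given input, and replaces the shortest augmenting path solver with a brute-force search over assignments that is only feasible for tiny examples.

```python
# Hedged sketch: Mahalanobis-cost point matching between a query and a
# reference point set. Real shape-context systems use histogram
# descriptors and a shortest augmenting path solver (O(n^3)); the
# exhaustive search here is for clarity only.
from itertools import permutations

def mahalanobis(x, y, inv_cov):
    """Mahalanobis distance between 2-D points x and y, given the
    inverse covariance matrix inv_cov as a 2x2 nested list."""
    d = [x[0] - y[0], x[1] - y[1]]
    # Quadratic form d^T * inv_cov * d, then square root.
    q = (d[0] * (inv_cov[0][0] * d[0] + inv_cov[0][1] * d[1])
         + d[1] * (inv_cov[1][0] * d[0] + inv_cov[1][1] * d[1]))
    return q ** 0.5

def match_cost(query, reference, inv_cov):
    """Dissimilarity between two equally sized point sets: the total
    Mahalanobis cost of the best one-to-one correspondence."""
    n = len(query)
    cost = [[mahalanobis(q, r, inv_cov) for r in reference] for q in query]
    # Brute force over all assignments; a shortest augmenting path
    # algorithm solves the same assignment problem efficiently.
    return min(sum(cost[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

# Toy example: with an identity covariance the Mahalanobis distance
# reduces to the Euclidean distance, and the two sets below are the
# same shape with the points listed in a different order.
inv_cov = [[1.0, 0.0], [0.0, 1.0]]
query = [(0.0, 0.0), (1.0, 0.0)]
reference = [(1.0, 0.0), (0.0, 0.0)]
print(match_cost(query, reference, inv_cov))  # 0.0: identical shapes
```

For recognition, the resulting dissimilarities to the database shapes would then feed a k-nearest neighbor vote, as the abstract states.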

Copyright

© 2014 S. Muruganathan, N. Devarajan, D. Chitra and T. Manigandan. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.