Palmprint Recognition using Feature Level Fusion
R. Gayathri and P. Ramamoorthy
DOI : 10.3844/jcssp.2012.1049.1061
Journal of Computer Science
Volume 8, Issue 7
Problem statement: Palmprint-based biometrics have gained prominence over other biometric methods owing to their ease of acquisition, reliability, and high user acceptance. Extracting multiple features from a palm image yields higher authentication accuracy.

Approach: This study presents a palmprint identification methodology that uses Gabor wavelet entropy to extract multiple features present on the palmprint; the features are fused at the feature level using Dempster-Shafer theory and classified with a nearest neighbor approach. The wavelet transform groups features with similar vectors and allows distinct image features to be extracted; among the features obtainable via wavelet entropy are contrast, correlation, energy, and homogeneity. Palmprint matching is then performed with the nearest neighbor classifier.

Results and Conclusion: We selected left-hand palm images of 100 individuals, six images per person, 600 in total. One palm image per person served as a template (100 templates); the remaining 500 were treated as training samples. The experiments achieve a recognition accuracy of 98.6%, with an operating point at a False Acceptance Rate (FAR) of 0.03% and a False Rejection Rate (FRR) of 1.4%, on the publicly available database of The Hong Kong Polytechnic University. Experimental assessment on palmprint image databases validates the efficient recognition performance of the proposed algorithm compared with conventional palmprint recognition algorithms.
© 2012 R. Gayathri and P. Ramamoorthy. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
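The pipeline outlined in the abstract (Gabor filtering of the palm image, texture descriptors of contrast, correlation, energy, and homogeneity, feature-level fusion, and nearest-neighbor matching) can be sketched roughly as below. This is an illustrative reconstruction, not the authors' implementation: the Gabor kernel parameters, the grey-level co-occurrence computation of the four descriptors, and the use of simple concatenation in place of Dempster-Shafer fusion are all assumptions made for the demo.

```python
import numpy as np

def gabor_kernel(size=9, sigma=2.0, theta=0.0, lam=4.0):
    """Real part of a 2-D Gabor filter (illustrative parameters)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def convolve2d(img, ker):
    """Naive 'valid' 2-D correlation; sufficient for a small demo."""
    kh, kw = ker.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * ker)
    return out

def texture_features(img, levels=8):
    """Contrast, correlation, energy, and homogeneity computed from a
    horizontal-neighbour grey-level co-occurrence matrix (assumed here;
    the paper derives them via wavelet entropy)."""
    q = np.floor((img - img.min()) / (np.ptp(img) + 1e-9) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()
    i, j = np.mgrid[0:levels, 0:levels]
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
    contrast = ((i - j) ** 2 * p).sum()
    correlation = ((i - mu_i) * (j - mu_j) * p).sum() / (sd_i * sd_j + 1e-9)
    energy = (p ** 2).sum()
    homogeneity = (p / (1 + np.abs(i - j))).sum()
    return np.array([contrast, correlation, energy, homogeneity])

def palm_feature_vector(img, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Fuse texture features from several Gabor orientations by plain
    concatenation (the paper instead fuses with Dempster-Shafer theory)."""
    return np.concatenate([texture_features(convolve2d(img, gabor_kernel(theta=t)))
                           for t in thetas])

def nearest_neighbor(query_vec, templates):
    """Index of the closest stored template by Euclidean distance."""
    return int(np.argmin([np.linalg.norm(query_vec - t) for t in templates]))
```

In use, one fused vector per enrolled palm would be stored as a template, and a query image would be matched to the template with the smallest distance, mirroring the 100-template / 500-sample protocol described in the abstract.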