Close Range Photogrammetry and Neural Network for Facial Recognition
Rami Al-Ruzouq and Shatha Kadhim
DOI : 10.3844/ajassp.2012.1542.1552
American Journal of Applied Sciences
Volume 9, Issue 10
Recently, there has been increasing interest in utilizing imagery in fields such as archaeology, architecture, mechanical inspection and biometric identification, where the face is considered one of the most important physiological characteristics; it relates to the shape and geometry of the head and is used for identification and verification of a person's identity. In this study, close range photogrammetry with overlapping photographs was used to create a three-dimensional model of the human face, from which the coordinates of selected object points were extracted and used to calculate five geometric quantities that serve as biometric measures for uniquely recognizing humans. Probabilistic neural networks, with their remarkable ability to derive meaning from complicated or imprecise data, then use the extracted geometric quantities to find patterns and detect trends that are too complex to be noticed by humans or by other computer techniques. Quantifiable dimensions based on geometric attributes rather than radiometric characteristics were successfully extracted using close range photogrammetry. The Probabilistic Neural Network (PNN), a type of radial basis network, was used to classify the geometric parameters for face recognition; the designed recognition method is unaffected by facial gesture or color and has a lower cost compared with other techniques. The method is reliable and flexible with respect to the level of detail that describes the human surface. Experimental results using real data demonstrated the feasibility and quality of the suggested approach.
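The classification step described above, a PNN operating on a small vector of geometric face measurements, can be sketched as a Parzen-window classifier: each class scores a query by the average Gaussian kernel response of its training samples, and the highest-scoring class wins. This is a minimal illustration, not the authors' implementation; the five-element feature vectors, the smoothing parameter `sigma`, and the sample values are all hypothetical.

```python
import numpy as np

def pnn_classify(X_train, y_train, x, sigma=0.1):
    """Probabilistic Neural Network (Parzen-window) classification:
    score each class by the mean Gaussian kernel response of its
    training samples to the query x, then return the best class."""
    scores = {}
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        d2 = np.sum((Xc - x) ** 2, axis=1)            # squared distances to class samples
        scores[c] = np.mean(np.exp(-d2 / (2 * sigma ** 2)))  # summation-layer output
    return max(scores, key=scores.get)                # competitive output layer

# Hypothetical data: five geometric face measurements per enrolled image,
# two images each for two subjects (labels 0 and 1).
X = np.array([[1.00, 0.80, 1.20, 0.50, 0.90],
              [1.10, 0.80, 1.20, 0.50, 0.90],
              [2.00, 1.50, 1.80, 1.00, 1.40],
              [2.10, 1.50, 1.90, 1.00, 1.40]])
y = np.array([0, 0, 1, 1])

query = np.array([1.05, 0.80, 1.20, 0.50, 0.90])
print(pnn_classify(X, y, query))  # → 0 (matches subject 0)
```

The smoothing parameter `sigma` plays the role of the radial basis spread: too small and each class responds only to near-duplicates of its training samples, too large and the class responses blur together.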
© 2012 Rami Al-Ruzouq and Shatha Kadhim. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.