Journal of Computer Science

Head Gesture Analysis using Matrix Group Displacement Algorithm

Mahmoud Z. Iskandarani

DOI: 10.3844/jcssp.2010.1362.1365

Volume 6, Issue 11

Pages 1362-1365

Abstract

A novel algorithm for head gesture interpretation is designed and tested. The system carries out gesture detection and recognition using the Matrix Group Displacement Algorithm (MGDA), which combines random sampling and importance sampling to track head poses and estimate head positions. Problem statement: Head position is an important indicator of a person’s focus of attention and can serve as a key for multi-view face analysis, on the assumption that face recognition and identification are view-dependent. This helps in selecting the best view model. In recent years, face detection and person identification have also become important issues due to security concerns, motivating the development and implementation of head gesture algorithms. Approach: The captured image was allocated a map, after which a file conversion process was carried out, allowing image data conversion of head poses to be applied in the next stage. This yielded a specific number of matrices per pose holding the necessary information. These matrices were then assigned sequences representing head gesture poses, which were combined for classification and correlation purposes to regenerate a predicted, time-reconstructed continuous movement. Results: A reliable, fast and robust approach to static head gesture recognition was achieved and presented. Conclusion: This approach to head pose detection and gesture classification is strongly supported by its ability to correlate different signal input technologies, as the devised algorithm can accommodate different inputs.
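The pipeline outlined in the Approach (image → matrices per pose → pose sequences → correlation-based classification) can be illustrated with a minimal sketch. Everything here is an assumption for illustration only, not the paper's actual method: the block split, the mean-intensity signature, and the Pearson-correlation matcher are placeholders for the unspecified conversion and correlation steps.

```python
# Hypothetical sketch of the matrix-group pipeline described in the
# abstract. All function names and parameters are illustrative
# assumptions, not taken from the paper.

def split_into_matrices(image, rows=2, cols=2):
    """Split a 2-D grayscale image (list of lists) into rows*cols blocks."""
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols
    blocks = []
    for r in range(rows):
        for c in range(cols):
            blocks.append([row[c * bw:(c + 1) * bw]
                           for row in image[r * bh:(r + 1) * bh]])
    return blocks

def block_signature(block):
    """Summarise one block by its mean intensity (assumed summary)."""
    vals = [v for row in block for v in row]
    return sum(vals) / len(vals)

def pose_sequence(image):
    """Map a pose image to its sequence of block signatures."""
    return [block_signature(b) for b in split_into_matrices(image)]

def correlate(seq_a, seq_b):
    """Pearson correlation between two equal-length signature sequences."""
    n = len(seq_a)
    ma, mb = sum(seq_a) / n, sum(seq_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(seq_a, seq_b))
    va = sum((a - ma) ** 2 for a in seq_a) ** 0.5
    vb = sum((b - mb) ** 2 for b in seq_b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def classify(pose_img, gesture_templates):
    """Return the gesture label whose template best correlates with the pose."""
    seq = pose_sequence(pose_img)
    return max(gesture_templates,
               key=lambda label: correlate(seq, gesture_templates[label]))
```

For example, a 4×4 test image with a bright upper-right block would correlate best with a stored template exhibiting the same signature pattern, which is the sense in which the algorithm can "accommodate different inputs": any sensor whose output reduces to such signature sequences can be matched the same way.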

Copyright

© 2010 Mahmoud Z. Iskandarani. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.