American Journal of Applied Sciences

A Hybrid Retinal Image Registration Using Mutual Information

Pradeepa Palraj and Ila Vennila

DOI: 10.3844/ajassp.2013.1386.1391

Volume 10, Issue 11

Pages 1386-1391

Abstract

Image registration has numerous applications in medicine, in both diagnosis and therapy, but registering retinal images is a difficult task because of their structural and illumination characteristics, and conventional registration algorithms have proven less effective and more complicated for such applications. This study combines two approaches to retinal image registration, vessel-based and non-vessel-based, to overcome the drawbacks of the existing algorithms. The combined method is more efficient and less time-consuming than either technique applied individually, and this hybrid technique converges quickly when diagnosing retinal occlusion, which is common in diabetic patients in temperate countries. The dye-injected retinal image to be registered against the normal red-free image is first preprocessed, and the vessel bifurcations are extracted using image segmentation. The matching criterion between the two segmented images is then computed using mutual information; extracting the bifurcations reduces the search space and thereby simplifies the selection of the matching criterion. The angle of rotation that best aligns the base image with the reference image is optimized using Simulated Annealing (SA), and the result is compared with that obtained using a Genetic Algorithm (GA). Experimental results show that the hybrid registration technique requires less computational time when implemented with SA than with GA.
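The core loop the abstract describes, scoring a candidate rotation by the mutual information between the two images and searching the angle with simulated annealing, can be sketched as follows. This is a minimal illustration, not the authors' implementation: preprocessing and bifurcation extraction are omitted, the nearest-neighbour rotation and the SA schedule (initial temperature, cooling rate, step size) are assumptions, and all function names are illustrative.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two equally shaped images,
    estimated from their joint intensity histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()                  # joint distribution
    px = pxy.sum(axis=1)                     # marginal of image a
    py = pxy.sum(axis=0)                     # marginal of image b
    nz = pxy > 0                             # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

def rotate_nn(img, angle_deg):
    """Nearest-neighbour rotation about the image centre (edge pixels clamped)."""
    t = np.deg2rad(angle_deg)
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse mapping: for each output pixel, find its source coordinate.
    sx = np.cos(t) * (xs - cx) + np.sin(t) * (ys - cy) + cx
    sy = -np.sin(t) * (xs - cx) + np.cos(t) * (ys - cy) + cy
    sx = np.clip(np.round(sx).astype(int), 0, w - 1)
    sy = np.clip(np.round(sy).astype(int), 0, h - 1)
    return img[sy, sx]

def sa_best_angle(base, moving, t0=5.0, cooling=0.95, steps=200, seed=0):
    """Simulated annealing over the rotation angle, maximizing MI.
    Worse moves are accepted with probability exp(dMI / temperature)."""
    rng = np.random.default_rng(seed)
    angle = 0.0
    cur_mi = mutual_information(base, rotate_nn(moving, angle))
    best_angle, best_mi, temp = angle, cur_mi, t0
    for _ in range(steps):
        cand = angle + rng.normal(scale=2.0)          # small random perturbation
        mi = mutual_information(base, rotate_nn(moving, cand))
        if mi > cur_mi or rng.random() < np.exp((mi - cur_mi) / temp):
            angle, cur_mi = cand, mi
            if mi > best_mi:
                best_angle, best_mi = cand, mi
        temp *= cooling                               # geometric cooling schedule
    return best_angle, best_mi
```

In practice the search would run on the segmented bifurcation maps rather than the raw images, which is exactly the search-space reduction the abstract attributes to the bifurcation extraction step.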

Copyright

© 2013 Pradeepa Palraj and Ila Vennila. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.