Journal of Computer Science

Optimised Implementation of Dense Stereo Correspondence for Resource Limited Hardware

Deepambika Vadekkettathu Anirudhan and Malangai Abdul Rahiman

DOI: 10.3844/jcssp.2018.1303.1317

Volume 14, Issue 10

Pages 1303-1317

Abstract

Computer stereo vision is a passive sensing technique that recovers 3D information about an environment from 2D images. Stereo correspondence is the challenging task of finding matching pixels between a stereo image pair, based on the Lambertian criterion; its result is a disparity space image. The depth of objects from the camera can be calculated from the disparity values using the principle of triangulation. For vision-guided robot navigation, the need for stereo matching algorithms on low-power dedicated hardware that can achieve a high frame rate is unambiguous. A new, highly optimized implementation of the correlation-based Sum of Absolute Differences (SAD) correspondence algorithm on a low-cost, resource-limited FPGA is presented here. This System-on-Programmable-Chip based architecture achieved a frame rate of 50 fps with 64 disparity levels without using a microprocessor. On performance evaluation, the disparity map shows a maximum error of only 0.308%. This reconfigurable, parallel-processing, high-speed implementation of the algorithm uses only 43% of the available resources of a low-density Altera Cyclone II. The hardware stereo vision system outperforms existing stereo systems of its kind in accuracy, speed and resource utilization, offers a better trade-off between run-time speed and accuracy, and is suitable for most real-time range-finding applications.
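To illustrate the matching principle the abstract describes, the following is a minimal software sketch of SAD block matching followed by triangulation. It is not the paper's FPGA pipeline; the window size, disparity search range, matching convention (left patch compared against horizontally shifted right patches) and the function names are assumptions for illustration only:

```python
import numpy as np

def sad_disparity(left, right, max_disp=8, window=3):
    """Brute-force SAD block matching on a rectified grayscale pair.

    For each pixel of the left image, search shifts d in [0, max_disp)
    and keep the d minimising the sum of absolute differences between
    the window x window patch in `left` and the patch in `right`
    shifted left by d pixels.
    """
    h, w = left.shape
    half = window // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch_l = left[y - half:y + half + 1,
                           x - half:x + half + 1].astype(np.int64)
            best_d, best_cost = 0, None
            # Limit the search so the right-image patch stays in bounds.
            for d in range(min(max_disp, x - half + 1)):
                patch_r = right[y - half:y + half + 1,
                                x - d - half:x - d + half + 1].astype(np.int64)
                cost = np.abs(patch_l - patch_r).sum()
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

def depth_from_disparity(disp, focal_px, baseline_m):
    """Triangulation: Z = f * B / d; zero disparity maps to infinite depth."""
    z = np.full(disp.shape, np.inf)
    mask = disp > 0
    z[mask] = focal_px * baseline_m / disp[mask]
    return z

# Synthetic check: a horizontal gradient shifted by 2 pixels should
# yield disparity 2 away from the image borders.
left = np.tile(np.arange(12), (10, 1))
right = left + 2            # right[y, x] == left[y, x + 2]
d = sad_disparity(left, right)
z = depth_from_disparity(d, focal_px=100.0, baseline_m=0.1)
```

An FPGA realization replaces the nested loops with parallel per-disparity SAD units and line buffers, which is what makes the 50 fps figure achievable; the arithmetic per pixel is the same.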

Copyright

© 2018 Deepambika Vadekkettathu Anirudhan and Malangai Abdul Rahiman. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.