American Journal of Agricultural and Biological Sciences

DEVELOPMENT OF A MACHINE VISION SYSTEM FOR WEED DETECTION DURING BOTH OFF-SEASON AND IN-SEASON PERIODS IN BROADACRE NO-TILLAGE CROPPING LANDS

Huajian Liu, Sang Heng Lee and Chris Saunders

DOI: 10.3844/ajabssp.2014.174.193

Volume 9, Issue 2

Pages 174-193

Abstract

More than half of Australia's cropping land is under no-tillage, and weed control in continuously no-tilled cropping areas is becoming increasingly difficult. A major problem is that heavy herbicide use has caused some of the more prolific weeds to develop resistance to regular herbicides, so more powerful and more expensive options are being pursued. To overcome these problems and reduce herbicide usage, this research focuses on developing a machine vision system that can detect and map weeds or perform spot spraying. The weed detection methods described in this study cover three aspects: image acquisition, a new green plant detection algorithm using hybrid spectral indices and a new inter-row weed detection method that exploits the locations of the crop rows. The developed method can detect weeds both during the non-growing summer period and within the growing season until the crop canopy has closed. The design of the methods focuses on overcoming the challenges of the complex no-tillage background and on achieving faster image acquisition and processing times for real-time spot spraying. The experimental results show that the proposed methods are more suitable for weed detection against a no-tillage background than existing methods and could serve as a powerful tool for weed control.
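The abstract does not specify which spectral indices make up the hybrid green plant detector, so the sketch below is only an illustration of the general approach: it uses the classic Excess Green index (ExG = 2g − r − b on chromaticity-normalised channels), a common building block of such hybrid indices, to separate green plant pixels from a soil-and-stubble background. The threshold value and the toy image are assumptions, not values from the paper.

```python
import numpy as np

def excess_green_mask(rgb, threshold=0.1):
    """Return a boolean vegetation mask for an RGB image.

    Illustrative only: computes the Excess Green index on
    chromaticity-normalised channels and applies a fixed threshold.
    The paper's actual hybrid spectral indices are not given here.
    """
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0          # avoid division by zero on black pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b            # large for green plant material
    return exg > threshold

# Toy 1x2 image: a green pixel (plant) next to a brown pixel (stubble/soil)
img = np.array([[[40, 180, 50], [120, 90, 60]]], dtype=np.uint8)
mask = excess_green_mask(img)
# mask[0, 0] is True (plant); mask[0, 1] is False (background)
```

In practice the threshold would be chosen adaptively (e.g. by Otsu's method) rather than fixed, since illumination and residue colour vary widely across no-tillage fields.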

Copyright

© 2014 Huajian Liu, Sang Heng Lee and Chris Saunders. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.