Land Cover Change Detection Using Texture Analysis
Xiaofeng Wu, Fen Yang and Roly Lishman
DOI : 10.3844/jcssp.2010.92.100
Journal of Computer Science
Volume 6, Issue 1
Problem statement: Detecting land cover changes from remotely sensed data is an important task for environmental monitoring. Although visual textures have been applied to land use analysis, such applications have been limited to a few land cover categories and a single texture measure. Because land cover types are complex and often integrate various objects, a single texture measure cannot adequately characterize them. Approach: This study presented two types of texture measures for land cover types and applied them to detect possible land cover changes by discriminant analysis. The texture information of land cover types was modeled by two texture extraction methods: Laws' masks and Gabor filters. Laws' masks were designed to characterize features in local neighborhoods, while multi-channel information in the spatial frequency domain was modeled by Gabor filters with different orientations and spatial periods. The ability of these texture measures to detect land cover changes was evaluated by discriminant analysis. Based on the land cover transition matrix, change detection reduces to separating those pairs of land cover classes between which conversion is possible. The discriminant analysis was based on a statistical test that determines the contribution of each feature to the discrimination. Results: The experiments showed that this approach is capable of detecting changes and that different measures are suited to detecting different changes. Conclusion: The experiments provide a textural guide for change detection.
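The two texture measures named in the abstract can be sketched in Python with NumPy/SciPy. This is an illustrative sketch only, not the paper's implementation: the kernel subset, window size, and energy definition are assumptions. Laws' masks are built as outer products of 1-D kernels and summarized by a local energy statistic; a Gabor filter is a Gaussian envelope modulated by an oriented sinusoid.

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

# Illustrative subset of Laws' 1-D kernels (names follow Laws' convention):
# L5 = Level, E5 = Edge, S5 = Spot, R5 = Ripple.
KERNELS = {
    "L5": np.array([1, 4, 6, 4, 1], dtype=float),
    "E5": np.array([-1, -2, 0, 2, 1], dtype=float),
    "S5": np.array([-1, 0, 2, 0, -1], dtype=float),
    "R5": np.array([1, -4, 6, -4, 1], dtype=float),
}

def laws_texture_energy(image, window=15):
    """Return one texture-energy map per 2-D Laws mask.

    Each 5x5 mask is the outer product of two 1-D kernels (e.g. L5E5);
    'energy' here is the windowed mean of absolute filter responses
    (one common choice; the paper may use a different statistic).
    """
    image = image - image.mean()  # remove overall brightness bias
    features = {}
    for a, ka in KERNELS.items():
        for b, kb in KERNELS.items():
            mask = np.outer(ka, kb)                      # 5x5 Laws mask
            response = convolve(image, mask, mode="reflect")
            features[a + b] = uniform_filter(np.abs(response), size=window)
    return features

def gabor_kernel(frequency, theta, sigma=3.0, size=15):
    """Real part of a Gabor filter: Gaussian envelope x cosine carrier.

    `frequency` sets the spatial period, `theta` the orientation,
    matching the two parameters the abstract says were varied.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_rot = x * np.cos(theta) + y * np.sin(theta)        # rotate coordinates
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * frequency * x_rot)
```

A Gabor feature map is then obtained by convolving the image with `gabor_kernel(f, theta)` for each chosen frequency/orientation pair, and the resulting per-pixel feature vectors (Laws energies plus Gabor responses) feed the discriminant analysis that separates convertible land cover pairs.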
© 2010 Xiaofeng Wu, Fen Yang and Roly Lishman. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.