Convolutional Neural Network Based Approaches for Instance Segmentation of Irrigated Agriculture in Satellite Imagery
- Author(s): Avery, Ryan Barry
- Advisor(s): Caylor, Kelly, et al.
Irrigated agriculture accounts for the large majority of consumptive water use, and demand for water has grown substantially over the 21st century. To date, fine-scale information about where irrigated agriculture is expanding has been difficult to acquire due to: 1) a lack of manually delineated field boundaries, owing to decentralized planning and management of agricultural development, and 2) the limited ability of shallow machine learning models trained on these limited data to generalize beyond small geographies or to accurately map field instances in remotely sensed imagery. Convolutional neural networks (CNNs), which learn deeper stacks of non-linear feature transformations, have been shown to outperform shallower models, such as decision tree ensembles and SVMs, in image classification and segmentation of true color photography. However, there has been little research on the performance of CNN-based segmentation models in the remote sensing domain. This thesis examines the performance of Mask R-CNN and Fully Convolutional Instance-aware Semantic Segmentation (FCIS), two CNN-based methods for segmenting object instances in images. Evaluating how these methods perform on remotely sensed imagery is important because such images tend to contain a higher variance of objects, and geospatial dataset labels are far fewer in number than in the large image corpora, such as ImageNet and COCO, used to benchmark deep learning image recognition algorithms. Results show that true color Landsat 5 scenes can be used to produce sufficiently accurate instance detections of center pivot fields, even at a coarser resolution than newer sensors such as Sentinel-2. This opens the possibility of using Landsat's long historical record for longitudinal studies of the irrigation and cropping dynamics of center pivot agriculture.