Seasonal Multi-temporal Pixel Based Crop Types and Land Cover Classification for Satellite Images using Convolutional Neural Networks

Bibliographic Details
Published in: 2018 13th International Conference on Computer Engineering and Systems (ICCES), pp. 21-26
Main Authors: Laban, Noureldin; Abdellatif, Bassam; Ebeid, Hala M.; Shedeed, Howida A.; Tolba, Mohamed F.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.12.2018
Summary: Satellite images have become a major source of data for many aspects of development, and land cover and crop classification from satellite imagery is an important emerging research topic. At the same time, Deep Convolutional Neural Networks (DCNNs) are a powerful technique for understanding images. This paper describes a pixel-based crop and land cover classification approach that uses imagery from a single satellite source, Sentinel-2, acquired on several dates within the same agricultural season. We propose a DCNN architecture in which the multi-temporal data are fed to a one-dimensional DCNN (1-D DCNN). The proposed architecture is compared with other satellite image classification algorithms, such as Support Vector Machines (SVMs), Random Forests (RFs), and k-Nearest Neighbors (k-NN). Experiments are conducted on the joint task of major crop and land cover classification for the Al-Fayoum governorate in Egypt. The 1-D DCNN achieves about 89% accuracy using 10 spectral bands of Sentinel-2 imagery over the area of interest. Although the proposed architecture outperforms the other methods, further research is needed to optimize its memory usage.
DOI: 10.1109/ICCES.2018.8639232
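
The paper's own implementation is not part of this record; the following is a minimal sketch of a pixel-based 1-D DCNN over multi-temporal spectral input, assuming the Keras API. The number of acquisition dates (5), the class count (8), and all layer sizes are illustrative assumptions rather than the paper's hyperparameters; only the 10 Sentinel-2 bands come from the summary above.

# Minimal sketch of a 1-D CNN for pixel-based multi-temporal classification.
# Assumptions (not from the paper): Keras API, 5 acquisition dates,
# 8 crop/land-cover classes, and illustrative layer sizes. The 10 bands
# per date follow the Sentinel-2 setup described in the summary.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

N_DATES, N_BANDS, N_CLASSES = 5, 10, 8  # dates and class count are hypothetical

model = keras.Sequential([
    # Each pixel is a short 1-D sequence: one 10-band vector per date.
    layers.Input(shape=(N_DATES, N_BANDS)),
    layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),
    layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),
    layers.GlobalMaxPooling1D(),  # collapse the temporal axis
    layers.Dense(64, activation="relu"),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Toy usage: X holds per-pixel multi-temporal spectra, y integer class labels.
X = np.random.rand(1000, N_DATES, N_BANDS).astype("float32")
y = np.random.randint(0, N_CLASSES, size=1000)
model.fit(X, y, epochs=2, batch_size=64, verbose=0)

Treating the acquisition dates as the convolution axis lets the network learn seasonal spectral patterns per pixel, which matches the multi-temporal framing described in the summary.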