Please use this identifier to cite or link to this item: http://hdl.handle.net/10261/127906


DC Field: Value
dc.contributor.author: Peña, José María
dc.contributor.author: Gutiérrez, Pedro Antonio
dc.contributor.author: Hervás-Martínez, César
dc.contributor.author: Six, Johan
dc.contributor.author: Plant, Richard E.
dc.contributor.author: López Granados, Francisca
dc.identifier.issn: 2072-4292
dc.identifier.citation: Remote Sensing 6(6): 5019-5041 (2014)
dc.description.abstract: The strategic management of agricultural lands involves crop field monitoring each year. Crop discrimination via remote sensing is a complex task, especially if different crops have a similar spectral response and cropping pattern. In such cases, crop identification could be improved by combining object-based image analysis and advanced machine learning methods. In this investigation, we evaluated the C4.5 decision tree, logistic regression (LR), support vector machine (SVM) and multilayer perceptron (MLP) neural network methods, both as single classifiers and combined in a hierarchical classification, for the mapping of nine major summer crops (both woody and herbaceous) from ASTER satellite images captured on two different dates. Each method was built with different combinations of spectral and textural features obtained after the segmentation of the remote images in an object-based framework. As single classifiers, MLP and SVM obtained a maximum overall accuracy of 88%, slightly higher than LR (86%) and notably higher than C4.5 (79%). The SVM+SVM classifier (best method) improved these results to 89%. In most cases, the hierarchical classifiers considerably increased the accuracy of the most poorly classified class (minimum sensitivity). The SVM+SVM method offered a significant improvement in classification accuracy for all of the studied crops compared to the conventional decision tree classifier, ranging between 4% for safflower and 29% for corn, which suggests the application of object-based image analysis and advanced machine learning methods in complex crop classification tasks.
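The abstract compares a C4.5-style decision tree, logistic regression, an SVM and an MLP as single classifiers over per-object spectral and textural features. A minimal sketch of such a comparison, using scikit-learn stand-ins and synthetic features in place of the paper's ASTER-derived object features (the data, feature counts, and resulting accuracies here are illustrative assumptions, not the authors' pipeline):

```python
# Illustrative sketch (not the authors' code): comparing the four classifier
# families named in the abstract on synthetic per-object features.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier        # stand-in for C4.5
from sklearn.linear_model import LogisticRegression    # LR
from sklearn.svm import SVC                            # SVM
from sklearn.neural_network import MLPClassifier       # MLP neural network

# Synthetic stand-in: 900 image objects, 20 spectral/textural features,
# 9 crop classes (matching the paper's nine summer crops in count only).
X, y = make_classification(n_samples=900, n_features=20, n_informative=12,
                           n_classes=9, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "C4.5-like tree": DecisionTreeClassifier(random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "SVM": SVC(kernel="rbf"),
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
}
for name, model in models.items():
    # Overall accuracy on the held-out objects.
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: overall accuracy = {acc:.2f}")
```

The paper's hierarchical variants (e.g. SVM+SVM) chain a first-stage classifier over crop groups with second-stage classifiers within each group; the sketch above covers only the single-classifier comparison.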
dc.description.sponsorship: This research was partly financed by the TIN2011-22794 project of the Spanish Ministerial Commission of Science and Technology (MICYT), FEDER funds, the P2011-TIC-7508 project of the "Junta de Andalucía" (Spain) and the Kearney Foundation of Soil Science (USA). The research of Peña was co-financed by the Fulbright-MEC postdoctoral program, financed by the Spanish Ministry for Science and Innovation, and by the JAEDoc Program, supported by CSIC and FEDER funds. ASTER data were available to us through a NASA EOS scientific investigator affiliation.
dc.description.sponsorship: We acknowledge support by the CSIC Open Access Publication Initiative through its Unit of Information Resources for Research (URICI).
dc.publisher: Multidisciplinary Digital Publishing Institute
dc.relation.isversionof: Publisher's version
dc.subject: Hierarchical classification
dc.subject: Neural networks
dc.subject: ASTER satellite images
dc.subject: Object-oriented image analysis
dc.title: Object-Based Image Classification of Summer Crops with Machine Learning Methods
dc.description.version: Peer Reviewed
dc.contributor.funder: Ministerio de Ciencia y Tecnología (España)
dc.contributor.funder: University of California
dc.contributor.funder: Junta de Andalucía
dc.contributor.funder: European Commission
dc.contributor.funder: Ministerio de Ciencia e Innovación (España)
dc.contributor.funder: Consejo Superior de Investigaciones Científicas (España)
dc.contributor.funder: Ministerio de Educación y Cultura (España)
Appears in Collections: (IAS) Artículos
Files in This Item:
File: Object-Based Image Classification_Peña.pdf (1.71 MB, Adobe PDF)

WARNING: Items in Digital.CSIC are protected by copyright, with all rights reserved, unless otherwise indicated.