Object Oriented Image Analysis in Remote Sensing of Forest and Vineyard Areas

Bibliographic Details
Published in: Bulletin of University of Agricultural Sciences and Veterinary Medicine Cluj-Napoca. Horticulture, Vol. 72, No. 2
Main Authors: Govedarica, Miro; Ristic, Aleksandar; Jovanovic, Dušan; Herbei, Mihai Valentin; Sala, Florin
Format: Journal Article
Language: English
Published: 27.11.2015

Summary: The study of vegetation cover, forests, orchards, vineyards, and crops through satellite techniques is increasingly promoted as a result of the capabilities these techniques offer. A number of methods and techniques for the processing and analysis of satellite images have been developed to increase the precision of the work, given the diversity of the vegetation structures analyzed and of the expected results. This study aimed to analyze the capabilities of object-oriented image analysis (OBIA) for recognizing forest and vineyard areas. OBIA is an automated process of object extraction that models the way the human visual system interprets images. The basis of the classification process is the object, which is created according to a set of characteristics. In the object-oriented approach, the classification description is based on classification rules that include spectral characteristics, size, and shape, as well as content and texture information. The analysis was performed on multispectral imagery of high and very high spatial resolution. The presented results show the usefulness of RapidEye and WorldView-2 images, as well as the importance of OBIA-based classification. The OBIA method applied to satellite imagery facilitated the recognition of forest and vineyard areas with high accuracy.
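
As an illustration of the workflow the abstract outlines (segmentation of the image into objects, followed by rule-based classification on object attributes), the following minimal Python sketch uses scikit-image's SLIC superpixels on synthetic data. SLIC stands in here for whatever segmentation the authors actually used, and the band layout, NDVI thresholds, minimum area, and class codes are placeholder assumptions, not values from the paper.

import numpy as np
from skimage.segmentation import slic

# Hypothetical 4-band scene (blue, green, red, near-infrared), reflectance in [0, 1].
# A real workflow would read a RapidEye or WorldView-2 raster here (e.g. via rasterio).
rng = np.random.default_rng(42)
image = rng.random((256, 256, 4)).astype(np.float32)

# Step 1: segmentation -- group pixels into spectrally homogeneous objects,
# the basic units of an object-oriented classification.
segments = slic(image, n_segments=400, compactness=10.0, channel_axis=-1)

# Class codes (hypothetical labels, not from the paper).
FOREST, VINEYARD, OTHER = 1, 2, 3
classified = np.zeros(segments.shape, dtype=np.uint8)

# Steps 2-3: compute per-object features, then apply classification rules.
for label in np.unique(segments):
    mask = segments == label
    mean_bands = image[mask].mean(axis=0)      # mean spectral signature of the object
    red, nir = mean_bands[2], mean_bands[3]
    ndvi = (nir - red) / (nir + red + 1e-9)    # object-level vegetation index
    area = int(mask.sum())                     # object size in pixels

    # Placeholder decision rules combining spectral and size attributes;
    # the paper's rule set also draws on shape, content, and texture information.
    if ndvi > 0.5 and area > 300:
        classified[mask] = FOREST
    elif 0.2 < ndvi <= 0.5:
        classified[mask] = VINEYARD
    else:
        classified[mask] = OTHER

In a real application the decision rules would be calibrated on training objects and the result validated against reference data, as the study does for its RapidEye and WorldView-2 scenes.
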
ISSN: 1843-5254; 1843-5394
DOI: 10.15835/buasvmcn-hort:11409