Spatial segmentation for processing videos for farming automation
| Published in | Computers and Electronics in Agriculture, Vol. 184, p. 106095 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Amsterdam: Elsevier B.V., 01.05.2021 |
| Subjects | |
Summary:

- This region segmentation method is easily adaptable to different farming applications.
- The goal of segmenting farming videos is to partition the frame into different regions.
- Trained on a small amount of data, the method adapts quickly to different applications.
- Cameras are mounted inside the cockpit to capture the operator's view.
- Applied to classifying farming activities and to automatic control of a combine harvester's header.
A camera mounted on the front of a large agricultural machine captures a rich collection of visual data. Powerful cues about the upcoming field can be extracted through video processing. However, accessing these cues requires methods that focus only on a specific region of the video frame, for example, the region containing the vehicle attachment or the upcoming field. To separate these spatial regions in farming videos, this paper presents a spatial segmentation method using a rapidly trained classifier. The classifier is trained on low-level hand-crafted features with limited data and can be easily adapted to different farming applications. We consider two applications here: classifying farming activities and automatic control for lifting the header of a combine harvester. We demonstrate experimentally that the segmentation algorithm enables an activity classification accuracy of 87%, as well as a prediction error of about 1.3 s on the correct time to lift the combine header.
ISSN: 0168-1699, 1872-7107

DOI: 10.1016/j.compag.2021.106095
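
As an illustration of the approach the abstract describes, the sketch below trains a lightweight classifier on low-level hand-crafted features computed per image block and then labels every block of a new frame with a region id. This is a minimal sketch under assumptions: the 32×32 block grid, the HSV-plus-gradient feature set, the random-forest classifier, and the region ids (sky, upcoming field, attachment) are all illustrative choices, not the paper's exact design.

```python
# Block-wise spatial segmentation of farming video frames: hand-crafted
# color/texture features per block, plus a quickly trained classifier.
# Feature set, block size, and classifier are illustrative assumptions.
import numpy as np
import cv2
from sklearn.ensemble import RandomForestClassifier

BLOCK = 32  # side length of each square block, in pixels (assumed)

def block_features(frame_bgr):
    """Return one low-level feature vector per BLOCK x BLOCK image block."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    mag = np.sqrt(gx ** 2 + gy ** 2)  # gradient magnitude as a texture cue
    h, w = gray.shape
    feats = []
    for y in range(0, h - BLOCK + 1, BLOCK):
        for x in range(0, w - BLOCK + 1, BLOCK):
            hsv_blk = hsv[y:y + BLOCK, x:x + BLOCK].reshape(-1, 3)
            mag_blk = mag[y:y + BLOCK, x:x + BLOCK]
            feats.append(np.concatenate([
                hsv_blk.mean(axis=0),              # mean H, S, V
                hsv_blk.std(axis=0),               # color spread
                [mag_blk.mean(), mag_blk.std()],   # texture strength
            ]))
    return np.asarray(feats, dtype=np.float32)

def train(frames, block_labels):
    """Fit the classifier from a few frames with block-level region labels
    (hypothetical ids: 0 = sky, 1 = upcoming field, 2 = attachment)."""
    X = np.vstack([block_features(f) for f in frames])
    y = np.concatenate(block_labels)
    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(X, y)
    return clf

def segment(clf, frame_bgr):
    """Label every block of a new frame; returns a coarse region-id grid."""
    h, w = frame_bgr.shape[:2]
    labels = clf.predict(block_features(frame_bgr))
    return labels.reshape(h // BLOCK, w // BLOCK)
```

The coarse block-label map this produces is the kind of region mask that could then gate downstream processing, for example restricting activity classification to the field region, or monitoring the attachment region to time a header lift.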