Real-Time Detection of Ripe Oil Palm Fresh Fruit Bunch based on YOLOv4

Bibliographic Details
Published in: IEEE Access, Vol. 10, p. 1
Main Authors: Lai, Jin Wern; Ramli, Hafiz Rashidi; Ismail, Luthffi Idzhar; Hasan, Wan Zuha Wan
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
Summary: Fresh Fruit Bunch (FFB) is the main ingredient in palm oil production. Harvesting FFB from oil palm trees at its peak ripeness stage is crucial to maximise the oil extraction rate (OER) and quality. In current harvesting practices, misclassification of FFB ripeness can occur due to human error, resulting in OER loss. Therefore, a vision-based ripe FFB detection system is proposed as the first step in a robotic FFB harvesting system. In this work, live camera input is fed into a Convolutional Neural Network (CNN) model known as YOLOv4 to detect the presence of ripe FFBs on the oil palm trees in real time. Once a ripe FFB is detected on the tree, a signal is transmitted via ROS to the robotic harvesting mechanism. To train the YOLOv4 model, a large number of ripe FFB images were collected using an Intel RealSense Camera D435 at a resolution of 1920×1080. During data acquisition, a subject matter expert assisted in classifying the FFBs as ripe or unripe. During the testing phase, after 2000 training iterations, the model achieved a mean Average Precision (mAP) of 87.9% and a recall of 82%, with detections counted as correct at an Intersection over Union (IoU) above 0.5, and the system operated in real time at roughly 21 frames per second (FPS).
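As an illustration of the pipeline the summary describes, below is a minimal sketch of a real-time detection loop that feeds live camera frames through a YOLOv4 model (loaded here via OpenCV's DNN module) and publishes a detection signal over ROS. The model file names, the ROS topic, the class index assumed to mean "ripe FFB", and the threshold values are illustrative assumptions, not details taken from the paper.

```python
# Sketch only: assumes Darknet-format YOLOv4 weights/config and a ROS 1 (rospy) setup.
import cv2
import rospy
from std_msgs.msg import Bool

CONF_THRESH = 0.5   # assumed detection confidence threshold
NMS_THRESH = 0.4    # assumed non-maximum suppression threshold

rospy.init_node("ripe_ffb_detector")
# Hypothetical topic consumed by the robotic harvesting mechanism.
pub = rospy.Publisher("/ffb/ripe_detected", Bool, queue_size=1)

# Load the trained YOLOv4 model (file names are placeholders).
net = cv2.dnn.readNetFromDarknet("yolov4-ffb.cfg", "yolov4-ffb.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

# Live camera input, e.g. the RealSense D435 colour stream exposed as a UVC device.
cap = cv2.VideoCapture(0)
while not rospy.is_shutdown():
    ok, frame = cap.read()
    if not ok:
        break
    class_ids, scores, boxes = model.detect(frame, CONF_THRESH, NMS_THRESH)
    # Class 0 is assumed to be the "ripe FFB" class in the trained model.
    ripe_found = any(int(cid) == 0 for cid in class_ids)
    pub.publish(Bool(data=bool(ripe_found)))  # signal the harvester via ROS
cap.release()
```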
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3204762