Multi-class fruit ripeness detection using YOLO and SSD object detection models
| Published in | Discover Applied Sciences, Vol. 7, No. 9, pp. 931-17 |
|---|---|
| Main Authors | , , , , |
| Format | Journal Article |
| Language | English |
| Published | Cham: Springer International Publishing (Springer Nature B.V.), 01.09.2025 |
| Subjects | |
Summary: Accurate fruit ripeness detection is critical to reducing post-harvest losses and improving quality control in agricultural systems. This study benchmarks four object detection models (YOLOv5, YOLOv6, YOLOv7, and SSD-MobileNetv1) for multi-class ripeness classification of strawberries and avocados across four stages: unripe, partially ripe, ripe, and rotten. The dataset, captured under natural conditions, was manually annotated and published for public access. YOLOv6 achieved the highest mean Average Precision (99.5%) and demonstrated a strong balance between accuracy and real-time inference speed (85.2 FPS). All models were evaluated using standard classification metrics and cross-validated with a 5-fold approach to ensure robustness. The results identify YOLOv6 as the most reliable model for smart fruit sorting and quality monitoring applications. This study offers a reproducible benchmarking pipeline and contributes toward the development of deployable deep learning solutions in precision agriculture.
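The 5-fold protocol mentioned in the summary can be illustrated with a short data-splitting sketch. The code below is a minimal, illustrative example using scikit-learn's KFold over a folder of images; the directory name, random seed, and the commented evaluation stub are assumptions for illustration, not the authors' published pipeline.

```python
# Minimal 5-fold cross-validation split sketch (illustrative only).
# Assumes images live under "dataset/images/"; the path and the
# evaluate_fold() stub are placeholders, not the authors' pipeline.
from pathlib import Path

import numpy as np
from sklearn.model_selection import KFold

image_paths = np.array(sorted(Path("dataset/images").glob("*.jpg")))

kfold = KFold(n_splits=5, shuffle=True, random_state=42)

fold_scores = []
for fold, (train_idx, val_idx) in enumerate(kfold.split(image_paths), start=1):
    train_files = image_paths[train_idx]
    val_files = image_paths[val_idx]
    print(f"Fold {fold}: {len(train_files)} train / {len(val_files)} val images")
    # Placeholder: train and evaluate a detector on this split, then record mAP.
    # fold_scores.append(evaluate_fold(train_files, val_files))

# Robustness is then reported as the mean (and spread) of the per-fold scores.
# print(f"Mean mAP over 5 folds: {np.mean(fold_scores):.3f}")
```

Reporting the mean and spread of mAP across the five folds is what supports the robustness claim in the summary.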
Article highlights
- Evaluated AI models to detect fruit ripeness stages using authentic images of strawberries and avocados.
- YOLOv6 stood out for both accuracy and speed, making it ideal for fruit sorting and quality checks (see the timing sketch after this list).
- A labelled dataset was shared to support future research in smart farming and food waste reduction.
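The speed side of the accuracy/speed trade-off (the 85.2 FPS figure quoted above) is typically measured with a simple timing loop over single-image inference. The sketch below is an assumption-laden illustration: it loads a model through the public ultralytics/yolov5 torch.hub entry point and uses a placeholder image folder; the paper's best model, YOLOv6, ships with its own tooling, so this is not the authors' exact setup.

```python
# Minimal FPS timing sketch (illustrative; not the authors' pipeline).
# Assumes the public ultralytics/yolov5 torch.hub entry point and a folder
# of test images at "test_images/" -- both are placeholders.
import time
from pathlib import Path

import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
model.eval()

image_paths = sorted(str(p) for p in Path("test_images").glob("*.jpg"))

# Warm-up pass so lazy initialisation does not skew the timing.
if image_paths:
    model(image_paths[0])

start = time.perf_counter()
for path in image_paths:
    model(path)  # one forward pass per image
elapsed = time.perf_counter() - start

if image_paths:
    print(f"Throughput: {len(image_paths) / elapsed:.1f} FPS "
          f"over {len(image_paths)} images")
```

Throughput measured this way depends heavily on image size, batching, and hardware, so FPS figures are only comparable when those are held fixed across models.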
ISSN: 3004-9261; 2523-3963; 2523-3971
DOI: 10.1007/s42452-025-07617-7