A vision-based method to estimate volume and mass of fruit/vegetable: Case study of sweet potato

Bibliographic Details
Published in: International Journal of Food Properties, Vol. 25, No. 1, pp. 717-732
Main Authors: Huynh, Tri T. M.; TonThat, Long; Dao, Son V. T.
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis, 31.12.2022
Summary: Among physical attributes, the dimensions and weight of agricultural products are important parameters for designing grading/packaging systems. Size and weight measurements usually require extensive time and labor, and it is often difficult to take accurate measurements of products with irregular shapes, such as sweet potatoes. Although there are many previous works on the topic, they either lack accuracy or require multiple captures and substantial computing power to reconstruct a three-dimensional representation of the product. This paper proposes measuring sweet potato features, including body length, width, and thickness, with a simple vision system that requires only a single camera capturing a top-view image of the product. After a background segmentation step, the product is virtually sliced into many equal slices along its longitudinal axis, and the volume is calculated as the sum of the volumes of these individual slices. Since the products' weight is highly correlated with their volume, the weight can then be estimated from the volume. Experimental results show that the proposed approach achieves highly competitive performance, with an accuracy of up to 96% (R² of 0.98) for volume estimation and up to 95% (R² of 0.96) for weight estimation. Due to its simplicity, the model can be used to design and develop sizing/weighing/packaging systems.
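The slice-summation idea described in the summary can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the product has already been segmented into a binary top-view mask whose longitudinal axis is aligned with the image columns, that each slice's cross-section is approximately elliptical, that thickness is taken as a fixed proportion of the observed width (the thickness_ratio parameter is hypothetical, since a single top view does not observe thickness directly), and that the weight step uses placeholder regression coefficients that would in practice be fitted to ground-truth weight/volume pairs.

import numpy as np

def estimate_volume_from_mask(mask: np.ndarray, mm_per_px: float,
                              n_slices: int = 100,
                              thickness_ratio: float = 1.0) -> float:
    """Approximate product volume (mm^3) from a binary top-view mask.

    Assumes the mask is a 2-D 0/1 array aligned so the product's
    longitudinal axis runs along the image columns.
    """
    # Columns that contain product pixels define the body length.
    cols = np.where(mask.any(axis=0))[0]
    if cols.size == 0:
        return 0.0
    length_px = cols[-1] - cols[0] + 1
    slice_px = length_px / n_slices  # equal slices along the longitudinal axis

    volume = 0.0
    for i in range(n_slices):
        lo = int(cols[0] + i * slice_px)
        hi = int(cols[0] + (i + 1) * slice_px)
        band = mask[:, lo:max(hi, lo + 1)]
        width_px = band.any(axis=1).sum()          # vertical extent of this slice
        w = width_px * mm_per_px                   # slice width in mm
        t = w * thickness_ratio                    # assumed thickness in mm
        dl = slice_px * mm_per_px                  # slice length in mm
        volume += np.pi * (w / 2) * (t / 2) * dl   # elliptical-cylinder slice volume
    return volume

def estimate_weight(volume_mm3: float, slope: float = 1.05e-3,
                    intercept: float = 0.0) -> float:
    """Map volume to weight (g) with a linear fit; the default coefficients
    are placeholders, not values reported in the paper."""
    return slope * volume_mm3 + intercept

The per-slice elliptical approximation is one plausible way to realize the "sum of slice volumes" step from a single view; the paper itself may use a different cross-section model or a calibrated thickness estimate.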
ISSN: 1094-2912, 1532-2386
DOI: 10.1080/10942912.2022.2057528