Deep learning for automated fish grading

Bibliographic Details
Published in: Journal of Agriculture and Food Research, Vol. 14, p. 100711
Main Authors: Jayasundara, J.M.V.D.B., Ramanayake, R.M.L.S., Senarath, H.M.N.B., Herath, H.M.S.L., Godaliyadda, G.M.R.I., Ekanayake, M.P.B., Herath, H.M.V.R., Ariyawansa, S.
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.12.2023
Summary: Fish is a staple food around the globe, and its quality is heavily dependent on freshness. The conventional method for evaluating fish quality is visual inspection of a sample. However, this approach relies on human senses for precise assessment and is therefore susceptible to variability in accuracy and efficiency. Moreover, the potential for safety to be compromised by errors in the evaluation process makes it a less reliable method. This work presents two neural network (NN) architectures, FishNET-S and FishNET-T, to evaluate the quality of the Indian Sardinella and the Yellowfin Tuna, respectively, using RGB images captured with smartphone cameras. FishNET-S is based on VGG-16 with a Block Attention Module (BAM) introduced to drive the network towards learning quality-related features from the eye region of the whole-fish image. In contrast, FishNET-T first performs a color decomposition based on hue, saturation, and intensity transformations, then forwards the hue and saturation components to the CNN to identify quality grades from the fish meat. Experimentally, FishNET-S obtained an accuracy of 84.1%, while FishNET-T yielded an accuracy of 68.3%. A comparative analysis against generic machine learning and state-of-the-art deep learning models shows that the proposed architectures outperform both.

Highlights:
• Neural networks prove effective when grading fish using images.
• A block attention module enhances fish quality grading accuracy in neural networks.
• Fish meat color is vital in the quality grading of fish via neural networks.
• Deep learning outperforms machine learning for fish quality grading using images.
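
To make the FishNET-T preprocessing step described above more concrete, the sketch below converts an RGB image to hue, saturation, and intensity components and feeds only the hue and saturation channels to a small CNN. It is a minimal illustration under stated assumptions: the HSI formulas are the standard textbook definitions (the abstract only names the transformation), and the function names, the placeholder network, and the use of NumPy/PyTorch are hypothetical rather than the authors' implementation.

import numpy as np
import torch
import torch.nn as nn

def rgb_to_hsi(rgb):
    # rgb: H x W x 3 float array in [0, 1]. Returns hue (radians, [0, 2*pi)),
    # saturation, and intensity, using the standard HSI definitions (assumed).
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-8
    intensity = (r + g + b) / 3.0
    saturation = 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / (r + g + b + eps)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    hue = np.where(b > g, 2.0 * np.pi - theta, theta)
    return hue, saturation, intensity

class TinyMeatGradeCNN(nn.Module):
    # Placeholder two-channel classifier standing in for the (unspecified)
    # FishNET-T backbone; the number of grades is an assumption.
    def __init__(self, num_grades=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_grades)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Example: decompose a dummy RGB image and forward only hue and saturation.
rgb = np.random.rand(224, 224, 3).astype(np.float32)
hue, sat, _ = rgb_to_hsi(rgb)                      # intensity is discarded
hs = np.stack([hue / (2.0 * np.pi), sat], axis=0)  # normalize hue to [0, 1]
logits = TinyMeatGradeCNN()(torch.from_numpy(hs).unsqueeze(0).float())

Discarding the intensity channel is the design choice the abstract emphasizes: grade-relevant information is carried by the meat's chromatic components (hue and saturation), while intensity is more sensitive to lighting conditions of smartphone capture.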
ISSN: 2666-1543
DOI: 10.1016/j.jafr.2023.100711