Dietary Nutritional Information Autonomous Perception Method Based on Machine Vision in Smart Homes
| Published in | Entropy (Basel, Switzerland), Vol. 24, no. 7, p. 868 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Switzerland: MDPI AG, 24.06.2022 |
Summary: To automatically perceive the user's dietary nutritional information in a smart-home environment, this paper proposes a machine-vision-based method for the autonomous perception of dietary nutritional information in smart homes. First, a food-recognition algorithm based on YOLOv5 monitors the user's dietary intake through a social robot. Second, to obtain the nutritional composition of the user's intake, the weights of food ingredients are calibrated and a method for calculating food nutritional composition is designed; on this basis, a dietary nutritional information autonomous perception method based on machine vision (DNPM) is proposed that supports quantitative analysis of nutritional composition. Finally, the proposed algorithm is evaluated on CFNet-34, a self-expanded dataset built on the Chinese food dataset ChineseFoodNet. The test results show that the YOLOv5-based food-recognition algorithm achieves an average recognition accuracy of 89.7%, demonstrating good accuracy and robustness. In performance tests of the dietary nutritional information autonomous perception system in smart homes, the average nutritional-composition perception accuracy was 90.1%, the response time was under 6 ms, and the processing speed exceeded 18 fps, showing excellent robustness and nutritional-composition perception performance.
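The nutrient-calculation step outlined in the summary (recognized foods with calibrated ingredient weights combined with a per-100 g composition table) can be sketched as follows. This is a minimal illustration only: the food labels, nutrient values, and function names are hypothetical assumptions, not taken from the paper, and the detection step (YOLOv5) is represented simply by its output labels.

```python
# Hypothetical sketch of the nutritional-composition calculation:
# given foods recognized by a detector (e.g. YOLOv5) and their
# calibrated weights, accumulate per-nutrient totals from a
# per-100 g composition table. All labels and values below are
# illustrative, not from the paper's dataset.

# Illustrative per-100 g nutrient table.
NUTRIENTS_PER_100G = {
    "steamed_rice": {"energy_kcal": 130, "protein_g": 2.7, "fat_g": 0.3, "carbs_g": 28.0},
    "mapo_tofu":    {"energy_kcal": 150, "protein_g": 9.0, "fat_g": 10.0, "carbs_g": 5.0},
}

def meal_composition(detections):
    """detections: list of (food_label, calibrated_weight_g) pairs."""
    totals = {"energy_kcal": 0.0, "protein_g": 0.0, "fat_g": 0.0, "carbs_g": 0.0}
    for label, weight_g in detections:
        per100 = NUTRIENTS_PER_100G[label]
        scale = weight_g / 100.0  # table values are per 100 g
        for key in totals:
            totals[key] += per100[key] * scale
    return totals

print(meal_composition([("steamed_rice", 200), ("mapo_tofu", 150)]))
```

In a full pipeline, the detector would supply the labels and a weight-calibration stage the gram values; the summation itself is the straightforward scaling shown here.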
| ISSN | 1099-4300 |
|---|---|
| DOI | 10.3390/e24070868 |