A novel method for Jinnan cattle individual classification based on deep mutual learning
Published in: Systems Science & Control Engineering, Vol. 11, No. 1
Main Authors: , , , ,
Format: Journal Article
Language: English
Published: Macclesfield: Taylor & Francis, 31.12.2023
Summary: As the core technology of precision animal husbandry, efficient and rapid identification of individual Jinnan cattle can promote large-scale, information-driven and refined breeding, which is essential for the development of animal husbandry at this stage. However, the traditional ear-tag-based method of individual livestock recognition is labour-intensive, time-consuming and inefficient; tags are prone to wear, recognition distance is limited, and accuracy is low. To solve this problem, a new non-contact, image-based method for Jinnan cattle individual recognition based on deep mutual learning is proposed. Two student networks are designed that supervise each other and complete the task together; their performance can exceed that of a strong teacher network. This method also enhances the individual recognition performance for Jinnan cattle. The experimental results verify the effectiveness of deep mutual learning: consulting and learning from a peer network improves generalization and thereby model recognition performance. The final accuracy is 99.3% on the Jinnan cattle individual dataset established in this paper. The application of contactless individual cattle recognition on farms is of great significance.
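The two-student scheme summarized above follows the standard deep mutual learning formulation: each student network minimizes its own supervised cross-entropy plus the KL divergence from its peer's predictive distribution. Below is a minimal NumPy sketch of that per-batch loss only; the network architectures, optimizer, and any paper-specific details are omitted, and the function name `dml_losses` is illustrative, not from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable row-wise softmax over class logits.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def dml_losses(logits1, logits2, labels, eps=1e-12):
    """Per-batch deep mutual learning losses for two student networks.

    Each student's loss = cross-entropy against the ground-truth labels
    + KL divergence from the peer's softmax distribution, so the two
    students supervise each other during training.
    """
    p1, p2 = softmax(logits1), softmax(logits2)
    n = labels.shape[0]
    # Supervised cross-entropy for each student.
    ce1 = -np.log(p1[np.arange(n), labels] + eps).mean()
    ce2 = -np.log(p2[np.arange(n), labels] + eps).mean()
    # Mutual mimicry terms: KL(peer || self), averaged over the batch.
    kl_1 = (p2 * (np.log(p2 + eps) - np.log(p1 + eps))).sum(axis=1).mean()
    kl_2 = (p1 * (np.log(p1 + eps) - np.log(p2 + eps))).sum(axis=1).mean()
    return ce1 + kl_1, ce2 + kl_2
```

In a training loop, each loss would be backpropagated only through its own student's logits (the peer's outputs are treated as fixed targets for that step), which is what lets two weak students jointly match or exceed a single strong teacher.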
ISSN: 2164-2583
DOI: 10.1080/21642583.2023.2207587