Automated Weed Detection Using YOLOv8: A Deep Learning Approach for Indian Agriculture

Bibliographic Details
Published in: International Conference on Signal Processing and Communication (Online), pp. 289 - 294
Main Authors: Singh, Somay Raj; Sakya, Gayatri; Malik, Monika; Grover, Chhaya
Format: Conference Proceeding
Language: English
Published: IEEE, 20.02.2025
Summary: Manual weed detection is a tedious and challenging task. Most people lack sufficient knowledge of weeds and their types; numerous weed species grow in the Indian subcontinent, and recognizing and naming them by hand is difficult. Research in this area is limited, so this work was carried out to fill that gap by developing a deep learning model that identifies weeds through image processing. It illustrates how modern technologies can be combined and applied in practice to simplify work in the agricultural field: image processing and deep learning are used together to give a good detection rate. The first step is to collect image data from the fields. Second, the data is preprocessed and augmented. Third, the data is split into training and testing sets and fed to the YOLOv8 library. A dataset of 1000 images of a weed is used in an 80:20 ratio, 800 images for training the model and 200 for testing. The YOLOv8 Python library is used for training, with Jupyter Notebook as the computing platform. The proposed model achieves an accuracy of around 74 percent.
ISSN: 2643-444X
DOI: 10.1109/ICSC64553.2025.10967853
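
The pipeline described in the summary (collect field images, split them 80:20 into training and testing sets, then train with the YOLOv8 Python library) could look roughly like the sketch below. This is not the authors' released code: the folder names, the weed_data.yaml dataset config, and the training hyperparameters are illustrative assumptions.

```python
# Minimal sketch of the summary's pipeline: an 80:20 split of ~1000 weed images
# followed by YOLOv8 training with the ultralytics Python package.
# Directory layout, "weed_data.yaml", and hyperparameters are assumptions.
import random
import shutil
from pathlib import Path

from ultralytics import YOLO


def split_dataset(src_dir: str, dst_dir: str, train_ratio: float = 0.8) -> None:
    """Copy images into train/ and test/ subfolders using the paper's 80:20 ratio."""
    images = sorted(Path(src_dir).glob("*.jpg"))
    random.shuffle(images)
    cut = int(len(images) * train_ratio)  # e.g. 800 of 1000 images for training
    for subset, files in (("train", images[:cut]), ("test", images[cut:])):
        out = Path(dst_dir) / subset
        out.mkdir(parents=True, exist_ok=True)
        for f in files:
            shutil.copy(f, out / f.name)


if __name__ == "__main__":
    split_dataset("weed_images", "weed_dataset")   # assumed source folder of field photos
    model = YOLO("yolov8n.pt")                     # pretrained YOLOv8 nano weights
    # weed_data.yaml is assumed to point at the train/test folders and list the weed classes.
    model.train(data="weed_data.yaml", epochs=50, imgsz=640)
    metrics = model.val()                          # evaluate on the held-out 20%
    print(metrics)
```

The same calls run unchanged in a Jupyter Notebook cell, which matches the computing platform mentioned in the summary.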