Automatic Weed Detection Method for Smart Agricultural Systems Based on Deep Learning
Abstract
An automatic weed detection algorithm combining an improved single-stage object detection network (You Only Look Once version 5s, YOLOv5s) with traditional image processing is proposed to address several problems in smart agricultural systems: low positioning accuracy for the weed meristem, the limited computing resources available for deep learning in agricultural scenarios, and insufficient real-time speed. To reduce the number of parameters, the GhostNet module is introduced to compress the model. The CA attention mechanism is added to the backbone to enhance the model's feature extraction ability, and a bi-directional feature pyramid network (BiFPN) is adopted in the neck for feature fusion to improve the detection accuracy of small targets. The experimental results show that the improved YOLOv5s model achieved an accuracy of 96.1%, which is 1%, 4.6%, and 9.5% higher than that of YOLOv5s, Faster Region-based Convolutional Neural Network (Faster R-CNN), and YOLOv4-tiny, respectively. At the same time, the model size decreased significantly, with 43% fewer parameters than the original model. The improved YOLOv5s detector is first used to identify weeds. The excess-green (ExG) feature combined with the OTSU threshold segmentation algorithm is then used to extract the weed foreground within each detection box. Finally, contour detection and centroid calculation output the centroid coordinates as the position of the weed meristem.
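The post-detection localization step described above (excess-green feature, OTSU thresholding, centroid output) can be sketched as follows. This is a minimal numpy-only illustration, not the authors' implementation: the function names (`excess_green`, `otsu_threshold`, `weed_centroid`) are my own, the input is assumed to be an RGB crop taken from a YOLOv5s detection box, and the centroid is computed directly from the binary mask rather than via an explicit contour-detection library call.

```python
import numpy as np

def excess_green(img):
    """ExG = 2G - R - B, rescaled to 0..255 for thresholding.
    img: H x W x 3 uint8 array in RGB channel order (assumed)."""
    r = img[..., 0].astype(np.float64)
    g = img[..., 1].astype(np.float64)
    b = img[..., 2].astype(np.float64)
    exg = 2.0 * g - r - b
    exg = (exg - exg.min()) / (np.ptp(exg) + 1e-9) * 255.0
    return exg.astype(np.uint8)

def otsu_threshold(gray):
    """Classic OTSU: pick the threshold maximizing between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    total = gray.size
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = hist[:t].sum()          # background pixel count
        w1 = total - w0              # foreground pixel count
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * hist[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * hist[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def weed_centroid(img):
    """Return (x, y) centroid of the green foreground, or None if empty."""
    exg = excess_green(img)
    mask = exg >= otsu_threshold(exg)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Usage on a synthetic crop: black soil background with a green patch.
crop = np.zeros((20, 20, 3), dtype=np.uint8)
crop[5:10, 5:10, 1] = 200          # green square at rows/cols 5..9
print(weed_centroid(crop))          # centroid near (7.0, 7.0)
```

In the paper's pipeline this centroid would be mapped back to full-image coordinates by adding the detection box's offset; a real implementation would typically use OpenCV (`cv2.threshold` with `THRESH_OTSU`, `cv2.findContours`, `cv2.moments`) instead of the hand-rolled routines shown here.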

This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.