Weed location and recognition based on UAV imaging and deep learning

Rufei Zhang, Cong Wang, Xiaoping Hu, Yixue Liu, Shan Chen, Baofeng Su

Abstract


To locate and identify weeds in a wheat field efficiently, an unmanned aerial vehicle (UAV) based imaging method was developed in this study. A deep-learning weed detection model trained on image data was developed and implemented. The model uses the YOLOv3-tiny network to detect the pixel coordinates of weeds in images, and it obtains the geographic position of each weed by converting the pixel coordinates to geodetic coordinates. The identified weeds were then marked on a prescription map. The algorithm was implemented and tested using a commercial DJI Phantom 3 UAV. This study compared the performance of YOLOv3 and YOLOv3-tiny and found that YOLOv3-tiny was more suitable for mobile devices. The performance of YOLOv3-tiny at different confidence thresholds was also tested. The test results show that the model performs best when the threshold of the YOLOv3-tiny network is 0.5: under this condition, the mean Average Precision (mAP) is 72.5%, the Intersection over Union (IoU) is 80.12%, and the processing speed on the mobile device is 2 fps. Analysis of the weed positioning tests shows an average positioning error of 10.31 cm, which is small enough for agricultural operations. The UAV-based weed position detection system can therefore locate and identify weeds in a crop field quickly, efficiently, and effectively.
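The detection stage is only described at a high level here. As an illustration only (not the authors' implementation), the following Python sketch shows how a trained YOLOv3-tiny Darknet model could be run with OpenCV's DNN module, keeping detections above the 0.5 confidence threshold reported above; the model file names and the 416x416 input size are assumptions.

```python
# Hedged sketch: YOLOv3-tiny inference with OpenCV's DNN module.
# "yolov3-tiny-weed.cfg/.weights" are placeholder file names, not the
# authors' released model.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3-tiny-weed.cfg",
                                 "yolov3-tiny-weed.weights")

def detect_weeds(image, conf_threshold=0.5, nms_threshold=0.4):
    h, w = image.shape[:2]
    blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())

    boxes, confidences = [], []
    for out in outputs:
        for det in out:                      # det = [cx, cy, bw, bh, obj, cls...]
            score = det[4] * det[5:].max()   # objectness x best class score
            if score >= conf_threshold:
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2),
                              int(bw), int(bh)])
                confidences.append(float(score))

    # Non-maximum suppression removes overlapping duplicate boxes.
    keep = cv2.dnn.NMSBoxes(boxes, confidences, conf_threshold, nms_threshold)
    # Return the pixel centre of each surviving box (input to geolocation).
    return [(boxes[i][0] + boxes[i][2] // 2, boxes[i][1] + boxes[i][3] // 2)
            for i in np.array(keep).flatten()]
```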
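Likewise, the abstract states that pixel coordinates are converted to geodetic coordinates but gives no formula. Below is a minimal sketch of one common flat-ground approach, assuming a nadir-pointing camera with known flight altitude, horizontal field of view, and UAV heading; all parameter values in the example call are illustrative, not taken from the paper.

```python
# Minimal sketch: pixel -> geodetic conversion for a nadir UAV image,
# assuming flat ground, a level camera, and small-angle lat/lon deltas.
import math

EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius, metres

def pixel_to_geodetic(px, py, img_w, img_h,
                      uav_lat, uav_lon, altitude_m,
                      hfov_deg, yaw_deg):
    """Map image pixel (px, py) to (lat, lon) in degrees."""
    # Ground width covered by the image, from the horizontal FOV.
    ground_w = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    m_per_px = ground_w / img_w  # square pixels assumed

    # Pixel offset from the image centre, in metres on the ground.
    # Image x grows rightward; image y grows downward.
    dx = (px - img_w / 2.0) * m_per_px   # right of centre
    dy = (img_h / 2.0 - py) * m_per_px   # forward of centre

    # Rotate by UAV yaw (0 deg = north, clockwise positive) to get
    # east/north offsets.
    yaw = math.radians(yaw_deg)
    east = dx * math.cos(yaw) + dy * math.sin(yaw)
    north = -dx * math.sin(yaw) + dy * math.cos(yaw)

    # Metre offsets -> latitude/longitude deltas (small-angle).
    dlat = math.degrees(north / EARTH_RADIUS)
    dlon = math.degrees(east / (EARTH_RADIUS *
                                math.cos(math.radians(uav_lat))))
    return uav_lat + dlat, uav_lon + dlon

# Illustrative example: weed at pixel (1500, 900) in a 4000x3000 image
# taken from 30 m altitude with a 94-degree HFOV, heading due north.
lat, lon = pixel_to_geodetic(1500, 900, 4000, 3000,
                             34.2620, 108.0800, 30.0, 94.0, 0.0)
print(f"weed at {lat:.6f}, {lon:.6f}")
```

Errors in this scheme come mainly from GPS drift, altitude uncertainty, and camera tilt, which is consistent with the centimetre-scale average positioning error reported in the abstract.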

Keywords: UAV, deep learning, weed location, weed recognition, imaging method, target detection, Android app

DOI: 10.33440/j.ijpaa.20200301.63

 

Citation: Zhang R F, Wang C, Hu X P, Liu Y X, Chen S, Su B F. Weed location and recognition based on UAV imaging and deep learning. Int J Precis Agric Aviat, 2020; 3(1): 23–29.



