Article title
Identifiers
Title variants
Languages of publication
Abstracts
Agriculture is seeing a growing number of drone use cases, and with an expanding population, food production must become better organized. Drones are used to examine crops and to exploit the collected data to determine what requires greater attention. This research study focuses on how deep learning (DL) has been combined with drone technology to detect crop fields within given regions of interest (ROIs). Extracting images from a drone and analysing them with a DL system to identify crop fields and yields offers less-developed nations a solution to a prevalent challenge in land use–land cover (LULC) analysis. The limitations of drone spot-checking of agricultural fields and the constraints of using DL to detect yields are discussed. In addition, a novel method is offered for detecting and tracking crop fields using a single camera on a UAV. Given a sequence of video frames, background motion is estimated with a perspective transformation model, and distinct locations in the background-subtracted image are then identified to detect moving objects. Optical flow matching is used to determine the spatiotemporal features of each moving item and to categorize targets whose motion differs considerably from the background. Kalman filter tracking is used to ensure that detections are consistent over time. The hybrid crop field detection model is evaluated on real uncrewed aerial vehicle (UAV) recordings, and the findings suggest that it successfully detects and tracks crop fields from small UAVs with low computational resources. The proposed method includes a crop field module, which aids reconstruction quality evaluation by cropping specific ROIs from the whole field, and a reversing module, which projects ROIs back onto the corresponding raw pictures. The results show fast operation of the cropping and reversing modules, covering ROI height selection and reverse extraction of ROI locations from the raw pictures.
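The tracking stage described in the abstract uses a Kalman filter to keep detections consistent across frames. As a rough illustrative sketch only (not the authors' implementation; the state model and all noise parameters below are assumptions), a minimal constant-velocity Kalman filter smoothing one coordinate of a tracked field centroid might look like:

```python
# Minimal constant-velocity Kalman filter for smoothing a 1-D track.
# Illustrative sketch only: the paper's actual filter design is not given,
# and the process/measurement noise values (q, r) are arbitrary assumptions.

class Kalman1D:
    """Tracks position and velocity from noisy position measurements."""

    def __init__(self, q=1e-3, r=0.25):
        self.x = [0.0, 0.0]                 # state: [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.q = q                          # process noise variance
        self.r = r                          # measurement noise variance

    def step(self, z, dt=1.0):
        # Predict: x = F x, with F = [[1, dt], [0, 1]]
        px = self.x[0] + dt * self.x[1]
        pv = self.x[1]
        P = self.P
        # P = F P F^T + Q
        p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + self.q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + self.q
        # Update with a position measurement z (H = [1, 0]).
        s = p00 + self.r                # innovation covariance
        k0, k1 = p00 / s, p10 / s       # Kalman gain
        y = z - px                      # innovation (residual)
        self.x = [px + k0 * y, pv + k1 * y]
        # P = (I - K H) P
        self.P = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]
        return self.x[0]

# Hypothetical noisy centroid positions from consecutive frames.
tracker = Kalman1D()
measurements = [1.0, 2.1, 2.9, 4.2, 5.0]
smoothed = [tracker.step(z) for z in measurements]
```

In a full 2-D tracker the state would carry both image coordinates and their velocities, but the predict/update structure stays the same; the filter bridges frames where the detector briefly loses a field.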
Publisher
Journal
Year
Volume
Pages
2991–3004
Physical description
Bibliography: 24 items.
Contributors
author
- School of Information Technology and Engineering, Vellore Institute of Technology, Vellore, Tamil Nadu, India
author
- School of Information Technology and Engineering, Vellore Institute of Technology, Vellore, Tamil Nadu, India
Bibliography
- 1. Aslan MF, Durdu A, Sabanci K, Ropelewska E, Gültekin SS (2022) A comprehensive survey of the recent studies with UAV for precision agriculture in open fields and greenhouses. Appl Sci 12(3):1047
- 2. Bian C, Shi H, Wu S, Zhang K, Wei M, Zhao Y, Chen S (2022) Prediction of field-scale wheat yield using machine learning method and multi-spectral UAV data. Remote Sens 14(6):1474
- 3. Bouguettaya A, Zarzour H, Kechida A, Taberkit AM (2022) Deep learning techniques to classify agricultural crops through UAV imagery: a review. Neural Comput Appl 34:1–26
- 4. Brezani S, Hrasko R, Vanco D, Vojtas J, Vojtas P (2022) Deep learning for knowledge extraction from uav images 1. In: Information modelling and knowledge bases XXXIII (pp 44–63). IOS Press
- 5. Cheng M, Jiao X, Liu Y, Shao M, Yu X, Bai Y, Jin X (2022) Estimation of soil moisture content under high maize canopy coverage from UAV multimodal data and machine learning. Agric Water Manag 264:107530
- 6. Ganeva D, Roumenina E, Dimitrov P, Gikov A, Jelev G, Dragov R, Taneva K (2022) Phenotypic traits estimation and preliminary yield assessment in different phenophases of wheat breeding experiment based on UAV multispectral images. Remote Sens 14(4):1019
- 7. Impollonia G, Croci M, Martani E, Ferrarini A, Kam J, Trindade LM, Amaducci S (2022) Moisture content estimation and senescence phenotyping of novel Miscanthus hybrids combining UAV-based remote sensing and machine learning. GCB Bioenergy 14(6):639–656. https://doi.org/10.1111/gcbb.12930
- 8. Latif MA (2019) Multi-crop recognition using UAV-based high-resolution NDVI time-series. J Uncrewed Vehicle Syst 7(3):207–218
- 9. Lipping T, Linna P, Narra N (2022) New developments and environmental applications of drones. In FinDrones. Springer
- 10. Li KY, Sampaio de Lima R, Burnside NG, Vahtmäe E, Kutser T, Sepp K, Sepp K (2022a) Toward automated machine learning-based hyperspectral image analysis in crop yield and biomass estimation. Remote Sens 14(5):1114
- 11. Li Z, Chen Z, Cheng Q, Duan F, Sui R, Huang X, Xu H (2022b) UAV-based hyperspectral and ensemble machine learning for predicting yield in winter wheat. Agronomy 12(1):202
- 12. Li F, Bai J, Zhang M, Zhang R (2022c) Yield estimation of high-density cotton fields using low-altitude UAV imaging and deep learning. Plant Methods 18(1):1–11
- 13. Marshall M, Belgiu M, Boschetti M, Pepe M, Stein A, Nelson A (2022) Field-level crop yield estimation with PRISMA and Sentinel-2. ISPRS J Photogramm Remote Sens 187:191–210
- 14. Maimaitijiang M, Sagan V, Sidike P, Daloye AM, Erkbol H, Fritschi FB (2020) Crop monitoring using satellite/UAV data fusion and machine learning. Remote Sens 12(9):1357
- 15. Muruganantham P, Wibowo S, Grandhi S, Samrat NH, Islam N (2022) A systematic literature review on crop yield prediction with deep learning and remote sensing. Remote Sens 14(9):1990
- 16. Oikonomidis A, Catal C, Kassahun A (2022) Hybrid deep learning-based models for crop yield prediction. Appl Artif Intell 36:1–18
- 17. Safarijalal B, Alborzi Y, Najafi E (2022) Automated wheat disease detection using a ROS-based autonomous guided UAV. https://doi.org/10.21203/rs.3.rs-1251771/v1
- 18. Sharma P, Leigh L, Chang J, Maimaitijiang M, Caffé M (2022) Above-ground biomass estimation in oats using UAV remote sensing and machine learning. Sensors 22(2):601
- 19. Song X, Wu F, Lu X, Yang T, Ju C, Sun C, Liu T (2022) The classification of farming progress in rice-wheat rotation fields based on UAV RGB images and the regional mean model. Agriculture 12(2):124
- 20. Wang Z, Zhao Z, Yin C (2022) Fine crop classification based on UAV hyperspectral images and random forest. ISPRS Int J Geo Inf 11(4):252
- 21. Yang MD, Tseng HH, Hsu YC, Yang CY, Lai MH, Wu DH (2021) A UAV open dataset of rice paddies for deep learning practice. Remote Sens 13(7):1358
- 22. Ye Z, Wei J, Lin Y, Guo Q, Zhang J, Zhang H, Yang K (2022) Extraction of olive crown based on UAV Visible images and the U2-Net deep learning model. Remote Sens 14(6):1523
- 23. Zhang X, Han L, Sobeih T, Lappin L, Lee MA, Howard A, Kisdi A (2022a) The self-supervised spectral-spatial vision transformer network for accurate prediction of wheat nitrogen status from UAV imagery. Remote Sens 14(6):1400
- 24. Zhang Y, Ta N, Guo S, Chen Q, Zhao L, Li F, Chang Q (2022b) Combining spectral and textural information from UAV RGB images for leaf area index monitoring in Kiwifruit Orchard. Remote Sens 14(5):1063
Notes
PL
Record developed with funds from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the "Social Responsibility of Science" programme, module: Popularisation of science and promotion of sport (2022–2023).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-b40c69ca-62f9-4daf-8133-79651a942148