Monitoring Bambara Groundnut Canopy State Variables at Various Growth Stages Using Low-Cost Remote Sensing Technology and Machine Learning Techniques


Abstract:

The aim of this study was to assess the efficacy of an unmanned aerial vehicle (UAV)-based remote sensing system for quantifying Bambara groundnut canopy state variables. Remotely sensed color-infrared images and in-situ canopy state variables were collected during Malaysia's 2018/19 Bambara growing season at the vegetative, flowering, podding, pod-filling, maturity, and senescence stages. Five common vegetation indices (VIs) were derived from the images, yielding single-stage and cumulative VIs (∑VIs). The relationship between canopy state variables and single-stage VIs/∑VIs was investigated using Pearson's correlation. Linear parametric and non-linear non-parametric machine learning (ML) regressions were employed to estimate canopy state variables using VIs/∑VIs as input features. The strongest correlations were observed at the flowering stage, and the ∑VIs accumulated from the vegetative to the senescence stage exhibited the most robust relationship with canopy state variables. CatBoostRegressor (CBR) excelled in training for all canopy state variables; however, it showed potential overfitting in testing. In contrast, Huber regression (HR) models provided consistent results in both training and testing, and HR performance was comparable to that of the top-performing ML algorithms in estimating Bambara groundnut canopy state variables.
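The workflow summarized above (derive VIs, correlate them with canopy state variables, then compare a linear parametric and a non-linear non-parametric regressor) can be sketched in a few lines of Python. The snippet below is a minimal, illustrative sketch, not the authors' code: the band reflectances and the canopy state variable are synthetic stand-ins, only two assumed example indices (NDVI and GNDVI) are computed rather than the five used in the study, and scikit-learn's HuberRegressor and the catboost package's CatBoostRegressor stand in for the HR and CBR models discussed in the abstract.

```python
# Illustrative sketch of the abstract's workflow; all data and variable
# names below are hypothetical stand-ins, not the study's dataset.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import HuberRegressor          # linear parametric model (HR)
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from catboost import CatBoostRegressor                   # non-linear non-parametric model (CBR)

rng = np.random.default_rng(0)
n_plots = 120
green = rng.uniform(0.05, 0.15, n_plots)                 # synthetic CIR band reflectances
red   = rng.uniform(0.03, 0.12, n_plots)
nir   = rng.uniform(0.30, 0.60, n_plots)

# Two example vegetation indices (the study uses five).
ndvi  = (nir - red)   / (nir + red)
gndvi = (nir - green) / (nir + green)
X = np.column_stack([ndvi, gndvi])                       # single-stage or cumulative (∑VI) features
y = 0.8 * ndvi + 0.2 * gndvi + rng.normal(0, 0.02, n_plots)  # stand-in canopy state variable

# Pearson's correlation between each VI and the canopy state variable.
for name, vi in [("NDVI", ndvi), ("GNDVI", gndvi)]:
    r, p = pearsonr(vi, y)
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")

# Train/test comparison of the HR and CBR regressors.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for label, model in [("HR", HuberRegressor()),
                     ("CBR", CatBoostRegressor(verbose=0, random_seed=0))]:
    model.fit(X_tr, y_tr)
    print(f"{label}: R2 train = {r2_score(y_tr, model.predict(X_tr)):.2f}, "
          f"R2 test = {r2_score(y_te, model.predict(X_te)):.2f}")
```

Comparing training and testing R2 in this way is what exposes the overfitting behaviour noted for CBR and the more consistent train/test performance of HR.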


Info:

Pages: 63-69

Online since: March 2024

Copyright: © 2024 Trans Tech Publications Ltd. All Rights Reserved

