Prediction of cotton yield based on soil texture, weather conditions and UAV imagery using deep learning.
- Source :
- Precision Agriculture; Feb 2024, Vol. 25, Issue 1, p303-326, 24p
- Publication Year :
- 2024
-
Abstract
- Crop yield prediction is important for farmers to conduct proper field management and make marketing decisions. Yield prediction models built on single-type, same-year data may not reflect the holistic effect of environment and management on crop development. This study aimed to quantify cotton yield variation due to soil texture and weather conditions using multiple-year unmanned aerial vehicle (UAV) imagery and deep learning techniques. UAV images were collected about once a month to quantify cotton growth at two irrigated cotton fields over three years (2017–2019). Soil apparent electrical conductivity (ECₐ) of the fields was measured and calibrated to quantify soil texture and soil water-holding capacity using eleven soil features. A convolutional neural network (CNN) was used to process and analyse the soil features at seven different depths. Similarly, a second CNN was used to analyse six weather parameters derived from historical data of a nearby weather station. A gated recurrent unit (GRU) network was used to predict cotton yield from the processed soil data, weather data, and UAV-based image features (e.g., NDVI) of different months. Results show that the GRU model trained on data from two years could predict cotton yield in a third year with a mean absolute error ranging from 247 kg ha⁻¹ (8.9%) to 384 kg ha⁻¹ (13.7%). The study indicates that the developed CNN and GRU networks have the potential to predict crop yield in years other than those used for training the models. [ABSTRACT FROM AUTHOR]
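The abstract describes feeding processed soil, weather, and monthly UAV image features through a GRU to predict yield. As a minimal NumPy sketch of a standard GRU recurrence over monthly feature vectors — not the authors' code, with all dimensions (11 inputs per month, a 16-unit hidden state, 5 months) and the linear yield head being illustrative assumptions:

```python
import numpy as np

# Hedged sketch (not the paper's implementation): one GRU layer stepping
# over monthly feature vectors, followed by a linear yield head.
# Dimensions are assumptions for illustration only.
rng = np.random.default_rng(42)
n_in, n_hid = 11, 16  # e.g., NDVI plus soil/weather embeddings per month

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialised gate weights (update z, reset r, candidate state).
Wz, Uz, bz = rng.normal(size=(n_in, n_hid)), rng.normal(size=(n_hid, n_hid)), np.zeros(n_hid)
Wr, Ur, br = rng.normal(size=(n_in, n_hid)), rng.normal(size=(n_hid, n_hid)), np.zeros(n_hid)
Wh, Uh, bh = rng.normal(size=(n_in, n_hid)), rng.normal(size=(n_hid, n_hid)), np.zeros(n_hid)

def gru_step(x, h):
    """One GRU update: gates decide how much of last month's state to keep."""
    z = sigmoid(x @ Wz + h @ Uz + bz)            # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)            # reset gate
    h_cand = np.tanh(x @ Wh + (r * h) @ Uh + bh) # candidate state
    return (1.0 - z) * h + z * h_cand

# Five monthly observations for one hypothetical field -> final hidden state.
months = rng.normal(size=(5, n_in))
h = np.zeros(n_hid)
for x_t in months:
    h = gru_step(x_t, h)

# Untrained linear head; after training this would output yield in kg/ha.
w_out = rng.normal(size=n_hid)
predicted_yield = float(h @ w_out)
```

Because the hidden state is a gated convex combination of tanh outputs, it stays bounded in (−1, 1), which is why such recurrences remain numerically stable across an arbitrary number of monthly steps.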
Details
- Language :
- English
- ISSN :
- 1385-2256
- Volume :
- 25
- Issue :
- 1
- Database :
- Complementary Index
- Journal :
- Precision Agriculture
- Publication Type :
- Academic Journal
- Accession number :
- 174712314
- Full Text :
- https://doi.org/10.1007/s11119-023-10069-x