9 results for "Yue Jibo"
Search Results
2. Classification of Maize Growth Stages Based on Phenotypic Traits and UAV Remote Sensing.
- Author
- Yao, Yihan, Yue, Jibo, Liu, Yang, Yang, Hao, Feng, Haikuan, Shen, Jianing, Hu, Jingyu, and Liu, Qian
- Subjects
- MACHINE learning, LEAF area index, KRIGING, CORN breeding, CROP growth, CORN
- Abstract
Maize, an important cereal crop and crucial industrial material, is widely used in various fields, including food, feed, and industry. Maize is also a highly adaptable crop, capable of thriving under various climatic and soil conditions. Against the backdrop of intensified climate change, studying the classification of maize growth stages can aid in adjusting planting strategies to enhance yield and quality. Accurate classification of the growth stages of maize breeding materials is important for enhancing yield and quality in breeding endeavors. Traditional remote sensing-based crop growth stage classifications mainly rely on time-series vegetation index (VI) analyses; however, VIs are prone to saturation under high-coverage conditions. Maize phenotypic traits at different growth stages may improve the accuracy of crop growth stage classifications. Therefore, we developed a method for classifying maize growth stages during the vegetative growth phase by combining maize phenotypic traits with different classification algorithms. First, we tested various VIs, texture features (TFs), and combinations of VIs and TFs as input features to estimate the leaf chlorophyll content (LCC), leaf area index (LAI), and fractional vegetation cover (FVC). We determined the optimal feature inputs and estimation methods and completed crop height (CH) extraction. Then, we tested different combinations of maize phenotypic traits as input variables to determine their accuracy in classifying growth stages and to identify the optimal combination and classification method. Finally, we compared the proposed method with traditional growth stage classification methods based on remote sensing VIs and machine learning models. The results indicate that (1) when the VI+TF combination is used as the input feature set, random forest regression (RFR) shows a good estimation performance for LCC (R²: 0.920, RMSE: 3.655 SPAD units, MAE: 2.698 SPAD units), Gaussian process regression (GPR) performs well for LAI (R²: 0.621, RMSE: 0.494, MAE: 0.397), and linear regression (LR) exhibits a good estimation performance for FVC (R²: 0.777, RMSE: 0.051, MAE: 0.040); (2) when the maize LCC, LAI, FVC, and CH phenotypic traits are used to classify maize growth stages, the random forest (RF) classification method achieves the highest accuracy (accuracy: 0.951, precision: 0.951, recall: 0.951, F1: 0.951); and (3) growth stage classification based on maize phenotypic traits outperforms traditional remote sensing-based crop growth stage classification. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
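The growth-stage classification step described in the record above reduces to fitting a classifier on four estimated phenotypic traits (LCC, LAI, FVC, CH). The following is a minimal sketch with scikit-learn; the feature matrix, stage labels, train/test split, and hyperparameters are placeholders and are not taken from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Placeholder data: one row per plot, columns = [LCC (SPAD), LAI, FVC, CH (m)];
# labels are vegetative growth stage codes (e.g., 0..3). Real values would come
# from the UAV-based trait estimates described in the abstract.
rng = np.random.default_rng(0)
X = rng.random((200, 4)) * [60.0, 6.0, 1.0, 2.5]
y = rng.integers(0, 4, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Random forest classifier, the best-performing method in the abstract;
# n_estimators is an assumed value, not reported there.
clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test)))
```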
3. Method for accurate multi-growth-stage estimation of fractional vegetation cover using unmanned aerial vehicle remote sensing
- Author
- Yue, Jibo, Guo, Wei, Yang, Guijun, Zhou, Chengquan, Feng, Haikuan, and Qiao, Hongbo
- Published
- 2021
- Full Text
- View/download PDF
4. Pretrained Deep Learning Networks and Multispectral Imagery Enhance Maize LCC, FVC, and Maturity Estimation.
- Author
- Hu, Jingyu, Feng, Hao, Wang, Qilei, Shen, Jianing, Wang, Jian, Liu, Yang, Feng, Haikuan, Yang, Hao, Guo, Wei, Qiao, Hongbo, Niu, Qinglin, and Yue, Jibo
- Subjects
- DEEP learning, STANDARD deviations, CORN, CROP canopies, AGRICULTURAL productivity
- Abstract
Crop leaf chlorophyll content (LCC) and fractional vegetation cover (FVC) are crucial indicators for assessing crop health, growth development, and maturity. In contrast to the traditional manual collection of crop trait parameters, unmanned aerial vehicle (UAV) technology rapidly generates LCC and FVC maps for breeding materials, facilitating prompt assessments of maturity information. This study addresses the following research questions: (1) Can image features based on pretrained deep learning networks and ensemble learning enhance remote sensing LCC and FVC estimation? (2) Can the proposed adaptive normal maturity detection (ANMD) algorithm effectively monitor maize maturity based on LCC and FVC maps? We conducted the following tasks: (1) UAV orthoimages of the maize canopy were collected for seven phases (tassel initiation to maturity), together with corresponding ground-truth LCC data and ground-truth FVC data for six phases. (2) Three feature types, namely vegetation indices (VI), texture features (TF) based on the gray-level co-occurrence matrix, and deep features (DF), were evaluated for LCC and FVC estimation. Moreover, the potential of four single machine learning models and three ensemble models for LCC and FVC estimation was evaluated. (3) The estimated LCC and FVC were combined with the proposed ANMD to monitor maize maturity. The research findings indicate that (1) image features extracted from pretrained deep learning networks more accurately describe crop canopy structure information, effectively eliminating saturation effects and enhancing LCC and FVC estimation accuracy. (2) Ensemble models outperform single machine learning models in estimating LCC and FVC, providing greater precision. Remarkably, the stacking + DF strategy achieved optimal performance in estimating LCC (coefficient of determination (R²): 0.930; root mean square error (RMSE): 3.974; mean absolute error (MAE): 3.096) and FVC (R²: 0.716; RMSE: 0.057; MAE: 0.044). (3) The proposed ANMD algorithm combined with LCC and FVC maps can be used to effectively monitor maize maturity. Establishing the maturity threshold for LCC based on the wax ripening period (P5) and successfully applying it to the wax ripening-mature period (P5–P7) achieved high monitoring accuracy (overall accuracy (OA): 0.9625–0.9875; user's accuracy (UA): 0.9583–0.9933; producer's accuracy (PA): 0.9634–1). Similarly, utilizing the ANMD algorithm with FVC also attained elevated monitoring accuracy during P5–P7 (OA: 0.9125–0.9750; UA: 0.878–0.9778; PA: 0.9362–0.9934). This study offers robust insights for future agricultural production and breeding and provides a valuable reference for the further exploration of crop monitoring technologies and methodologies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
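One way to approximate the "deep features (DF) + stacking" strategy described in the record above is to pool the activations of an ImageNet-pretrained backbone over each canopy image patch and feed them to a stacked regressor. The sketch below uses ResNet50 from Keras and scikit-learn's StackingRegressor; the backbone choice, patch size, and base learners are assumptions, not details from the paper.

```python
import numpy as np
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.applications.resnet50 import preprocess_input
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.svm import SVR
from sklearn.linear_model import Ridge

# Pretrained backbone with global average pooling: one 2048-d vector per patch.
backbone = ResNet50(weights="imagenet", include_top=False, pooling="avg",
                    input_shape=(224, 224, 3))

def deep_features(patches):
    """patches: (n, 224, 224, 3) array of canopy image patches."""
    return backbone.predict(preprocess_input(patches.astype("float32")), verbose=0)

# Placeholder canopy patches and ground-truth LCC (SPAD) values.
patches = np.random.rand(64, 224, 224, 3) * 255.0
lcc = np.random.uniform(20.0, 60.0, size=64)

X = deep_features(patches)

# Stacking ensemble: two base learners plus a linear meta-learner (assumed configuration).
stack = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(n_estimators=300, random_state=0)),
                ("svr", SVR(C=10.0))],
    final_estimator=Ridge(alpha=1.0),
)
stack.fit(X, lcc)
print("Predicted LCC for first patch:", stack.predict(X[:1])[0])
```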
5. Comparison of Different Dimensional Spectral Indices for Estimating Nitrogen Content of Potato Plants over Multiple Growth Periods.
- Author
- Fan, Yiguang, Feng, Haikuan, Yue, Jibo, Liu, Yang, Jin, Xiuliang, Xu, Xingang, Song, Xiaoyu, Ma, Yanpeng, and Yang, Guijun
- Subjects
- POTATOES, NITROGEN content of plants, STANDARD deviations, DRONE aircraft
- Abstract
The estimation of physicochemical crop parameters based on spectral indices depends strongly on planting year, cultivar, and growing period. Therefore, the efficient monitoring of crop growth and nitrogen (N) fertilizer treatment requires a generic spectral index that allows the rapid assessment of the plant nitrogen content (PNC) of crops and that is independent of year, cultivar, and growing period. Thus, to obtain the best indicator for estimating potato PNC, herein we provide an in-depth comparative analysis of the use of hyperspectral single-band reflectance and two- and three-band spectral indices of arbitrary bands for estimating potato PNC over several years and for different cultivars and growth periods. Potato field trials under different N treatments were conducted in 2018 and 2019. An unmanned aerial vehicle hyperspectral remote sensing platform was used to acquire canopy reflectance data at several key potato growth periods, and six spectral transformation techniques and 12 arbitrary band combinations were constructed. From these, optimal single-, two-, and three-dimensional spectral indices were selected. Finally, each optimal spectral index was used to estimate potato PNC under different scenarios, and the results were systematically evaluated based on a correlation analysis and univariate linear modeling. The results show that, although the spectral transformation techniques strengthen the correlation between spectral information and potato PNC, the PNC estimation model constructed from single-band reflectance is of limited accuracy and stability. In contrast, the optimal three-band spectral index TBI5 (530, 734, 514) performs best, with coefficients of determination of 0.67 and 0.65, root mean square errors of 0.39 and 0.39, and normalized root mean square errors of 12.64% and 12.17% for the calibration and validation datasets, respectively. The results thus provide a reference for the rapid and efficient monitoring of PNC in large potato fields. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
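The abstract above reports an optimal three-band index, TBI5 (530, 734, 514), but does not give its algebraic form. The sketch below only illustrates the general workflow: extract the three nearest bands from a hyperspectral cube, compute a candidate three-band index (the formula here is a hypothetical placeholder, not the paper's TBI5), and fit the univariate linear PNC model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def band(cube, wavelengths, target_nm):
    """Return the reflectance plane closest to target_nm from a (rows, cols, bands) cube."""
    idx = int(np.argmin(np.abs(np.asarray(wavelengths) - target_nm)))
    return cube[:, :, idx]

def three_band_index(cube, wavelengths):
    # Hypothetical combination of R530, R734, and R514; the actual TBI5
    # formulation is defined in the paper and is not reproduced here.
    r530 = band(cube, wavelengths, 530)
    r734 = band(cube, wavelengths, 734)
    r514 = band(cube, wavelengths, 514)
    return (r530 - r734) / (r530 + r734 - 2.0 * r514 + 1e-9)

# Placeholder plot-level data: mean index value per plot vs. measured PNC (%).
index_per_plot = np.random.rand(40, 1)
pnc = 2.0 + 1.5 * index_per_plot[:, 0] + np.random.normal(0, 0.2, 40)

model = LinearRegression().fit(index_per_plot, pnc)
print("R^2 on calibration data:", model.score(index_per_plot, pnc))
```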
6. Monitoring of Soybean Maturity Using UAV Remote Sensing and Deep Learning.
- Author
- Zhang, Shanxin, Feng, Hao, Han, Shaoyu, Shi, Zhengkai, Xu, Haoran, Liu, Yang, Feng, Haikuan, Zhou, Chengquan, and Yue, Jibo
- Subjects
- DEEP learning, REMOTE sensing, CONVOLUTIONAL neural networks, MACHINE learning, SUPPORT vector machines, SOYBEAN
- Abstract
Soybean breeders must develop early-maturing, standard, and late-maturing varieties for planting at different latitudes to ensure that soybean plants fully utilize solar radiation. Therefore, timely monitoring of soybean breeding line maturity is crucial for soybean harvesting management and yield measurement. Currently, widely used deep learning models focus on extracting deep image features, whereas shallow image feature information is ignored. In this study, we designed a new convolutional neural network (CNN) architecture, called DS-SoybeanNet, to improve the performance of unmanned aerial vehicle (UAV)-based soybean maturity information monitoring. DS-SoybeanNet can extract and utilize both shallow and deep image features. We used a high-definition digital camera on board a UAV to collect soybean canopy digital images. A total of 2662 soybean canopy digital images were obtained from two soybean breeding fields (fields F1 and F2). We compared the soybean maturity classification accuracies of (i) conventional machine learning methods (support vector machine (SVM) and random forest (RF)), (ii) current deep learning methods (InceptionResNetV2, MobileNetV2, and ResNet50), and (iii) our proposed DS-SoybeanNet method. Our results show the following: (1) The conventional machine learning methods (SVM and RF) had faster calculation times than the deep learning methods (InceptionResNetV2, MobileNetV2, and ResNet50) and our proposed DS-SoybeanNet method. For example, the computation speed of RF was 0.03 s per 1000 images. However, the conventional machine learning methods had lower overall accuracies (field F2: 63.37–65.38%) than the proposed DS-SoybeanNet (field F2: 86.26%). (2) The performances of the current deep learning and conventional machine learning methods notably decreased when tested on a new dataset. For example, the overall accuracies of MobileNetV2 for fields F1 and F2 were 97.52% and 52.75%, respectively. (3) The proposed DS-SoybeanNet model can provide high-performance soybean maturity classification results. It showed a computation speed of 11.770 s per 1000 images and overall accuracies for fields F1 and F2 of 99.19% and 86.26%, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
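DS-SoybeanNet is described above as a CNN that uses both shallow and deep image features, but its exact architecture is not given in the abstract. The Keras sketch below shows one generic way to fuse a pooled shallow feature map with a pooled deep feature map before maturity classification; the layer sizes, depths, and class set are illustrative assumptions only.

```python
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # e.g., immature / near-mature / mature (assumed label set)

def build_shallow_deep_cnn(input_shape=(128, 128, 3)):
    inputs = layers.Input(shape=input_shape)

    # Shallow branch: early convolutions capture color and texture cues.
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    x = layers.MaxPooling2D()(x)
    shallow = layers.GlobalAveragePooling2D()(x)

    # Deep branch: further convolutions capture higher-level canopy structure.
    y = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    y = layers.MaxPooling2D()(y)
    y = layers.Conv2D(128, 3, padding="same", activation="relu")(y)
    deep = layers.GlobalAveragePooling2D()(y)

    # Fuse shallow and deep descriptors, then classify maturity.
    fused = layers.Concatenate()([shallow, deep])
    fused = layers.Dense(128, activation="relu")(fused)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(fused)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_shallow_deep_cnn()
model.summary()
```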
7. Analyzing winter-wheat biochemical traits using hyperspectral remote sensing and deep learning.
- Author
- Yue, Jibo, Yang, Guijun, Li, Changchun, Liu, Yang, Wang, Jian, Guo, Wei, Ma, Xinming, Niu, Qinglin, Qiao, Hongbo, and Feng, Haikuan
- Subjects
- WINTER wheat, DEEP learning, CONVOLUTIONAL neural networks, LEAF area index, REMOTE sensing, SPECTRAL reflectance
- Abstract
• A novel model (LabTNet) is proposed to estimate crop biochemical traits.
• LabTNet jointly estimates multiple wheat traits from hyperspectral data.
• LabTNet exhibits superior predictive performance compared with existing methods.
• Grad-CAM was used to analyze the hyperspectral attention regions of LabTNet.
Accurate estimation of crop leaf and canopy biochemical traits, such as leaf dry matter content (Cm), leaf equivalent water thickness (Cw), leaf area index (LAI), dry leaf biomass (DLB), leaf total water content (LW), and fresh leaf biomass (FLB), is essential for monitoring crop growth accurately. The vegetation spectral feature technique combined with statistical regression methods is widely employed for remote sensing mapping of crop biochemical traits. However, crop canopy spectral reflectance is influenced by various crop biochemical traits as well as by uncertainties arising from changes in illumination geometry and soil background effects. Consequently, the remote-sensing estimation of crop biochemical traits is limited. A potential solution involves training a deep learning model to learn the physical relationship between crop biochemical traits and canopy spectral reflectance based on a physical radiative transfer model (RTM). The primary focus of this study is to propose a winter-wheat leaf and canopy biochemical traits analysis and mapping method based on hyperspectral remote sensing, utilizing a leaf area index and leaf biochemical traits deep learning network (LabTNet). This study consists of four main tasks: (1) Field-based measurements of winter-wheat spectra and biochemical traits were conducted in two growing seasons. A PROSAIL RTM was also employed to generate a simulated dataset representing comprehensive and complex winter-wheat field conditions. (2) The LabTNet deep learning model was pre-trained using the simulated spectra dataset to acquire knowledge of the physical relationship between crop biochemical traits and canopy spectral reflectance derived from the RTM. Subsequently, the model was re-trained using the field-based spectra dataset from two growing seasons, employing a transfer learning technique. (3) An analysis was conducted to assess the performance of LabTNet against traditional statistical regression methods in estimating crop leaf and canopy biochemical traits. The study used the gradient-weighted class activation mapping (Grad-CAM) technique to analyze the attention regions of the input spectra (454:8:950 nm, 960:10:1300 nm, 1450:10:1750 nm, 2000:10:2350 nm) across different convolutional neural network layers in LabTNet, aiming to enhance the interpretability of deep learning models. (4) Winter-wheat leaf and canopy biochemical traits (Cw, Cm, LAI, DLB, LW, and FLB) were mapped using the LabTNet deep learning model. Our research has the following conclusions: (1) Combining the RTM and deep learning techniques yields more accurate winter-wheat biochemical trait estimates than traditional statistical regression methods. (2) Different LabTNet layers focus on distinct areas of canopy reflectance, corresponding to the sensitive regions for various winter-wheat biochemical traits. (3) LabTNet demonstrates similar winter-wheat leaf and canopy biochemical trait estimation performance using visible and near-infrared (VNIR) reflectance data and full-spectral (FS) range hyperspectral reflectance as inputs (Cw: R² = 0.603–0.653, RMSE = 0.0015–0.0015 cm; Cm: R² = 0.511–0.560, RMSE = 0.0006–0.0007 g/m²; LAI: R² = 0.773–0.793, RMSE = 0.65–0.66 m²/m²; LW: R² = 0.842–0.847, RMSE = 67.93–70.73 g/m²; DLB: R² = 0.747–0.762, RMSE = 21.10–21.89 g/m²; FLB: R² = 0.831–0.840, RMSE = 86.26–90.30 g/m²). The combined use of UAV hyperspectral remote sensing and the LabTNet model proves effective in providing high-performance winter-wheat leaf and canopy biochemical trait maps, offering valuable insights for agricultural management. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
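The two-stage training described in the record above (pre-train on PROSAIL-simulated spectra, then transfer to field measurements) can be sketched as follows. This assumes the simulated and field spectra/trait arrays already exist; the simple 1-D CNN here is a stand-in and is not the LabTNet architecture.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

N_BANDS = 200   # number of hyperspectral bands (assumed)
N_TRAITS = 6    # Cw, Cm, LAI, DLB, LW, FLB

def build_regressor():
    inputs = layers.Input(shape=(N_BANDS, 1))
    x = layers.Conv1D(32, 7, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling1D(2)(x)
    x = layers.Conv1D(64, 5, activation="relu", padding="same")(x)
    x = layers.GlobalAveragePooling1D()(x)
    x = layers.Dense(128, activation="relu")(x)
    outputs = layers.Dense(N_TRAITS)(x)   # joint estimation of all six traits
    model = models.Model(inputs, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")
    return model

# Placeholder arrays; in practice these come from PROSAIL simulations and field campaigns.
sim_spectra = np.random.rand(5000, N_BANDS, 1)
sim_traits = np.random.rand(5000, N_TRAITS)
field_spectra = np.random.rand(300, N_BANDS, 1)
field_traits = np.random.rand(300, N_TRAITS)

model = build_regressor()

# Stage 1: learn the RTM-derived spectra-to-trait mapping from simulated data.
model.fit(sim_spectra, sim_traits, epochs=5, batch_size=64, verbose=0)

# Stage 2: transfer learning - fine-tune on field data with a smaller learning rate.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="mse")
model.fit(field_spectra, field_traits, epochs=20, batch_size=16, verbose=0)
```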
8. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices.
- Author
- Yue, Jibo, Yang, Guijun, Tian, Qingjiu, Feng, Haikuan, Xu, Kaijian, and Zhou, Chengquan
- Subjects
- HYPERSPECTRAL imaging systems, ESTIMATES, STANDARD deviations
- Abstract
When dealing with multiple growth stages, estimates of above-ground biomass (AGB) based on optical vegetation indices (VIs) are difficult for two reasons: (i) optical VIs saturate at medium-to-high canopy cover, and (ii) organs that grow vertically (e.g., the biomass of reproductive organs and stems) are difficult to detect with canopy spectral VIs. Although several significant improvements have been made for estimating AGB by using narrow-band hyperspectral VIs, synthetic aperture radar, light detection and ranging (lidar), the crop surface model technique, and combinations thereof, applications of these new techniques have been limited by cost, availability, data-processing difficulties, and high dimensionality. The present study thus evaluates the use of ultrahigh-ground-resolution image textures, VIs, and combinations thereof to make multiple temporal estimates and maps of AGB covering three winter-wheat growth stages. The selected gray-tone spatial-dependence matrix-based image textures (e.g., variance, entropy, data range, homogeneity, second moment, dissimilarity, contrast, correlation) are calculated from 1-, 2-, 5-, 10-, 15-, 20-, 25-, and 30-cm-ground-resolution images acquired by using an inexpensive RGB sensor mounted on an unmanned aerial vehicle (UAV). Optical-VI data were obtained by using a ground spectrometer to analyze UAV-acquired RGB images. The accuracy of AGB estimates based on optical VIs varies, with validation R²: 0.59–0.78, root mean square error (RMSE): 1.22–1.59 t/ha, and mean absolute error (MAE): 1.03–1.27 t/ha. The most accurate AGB estimate was obtained by combining image textures and VIs, which gave R²: 0.89, MAE: 0.67 t/ha, and RMSE: 0.82 t/ha. The results show that (i) the eight selected textures from ultrahigh-ground-resolution images were significantly related to AGB; (ii) the combined use of image textures from 1- to 30-cm-ground-resolution images and VIs can improve the accuracy of AGB estimates compared with using only optical VIs or image textures alone; (iii) high AGB values from winter-wheat reproductive growth stages can be accurately estimated by using this method; and (iv) high estimates of winter-wheat AGB (8–14 t/ha) using the proposed combined method (DIS1, SE30, B460, B560, B670, EVI2 using MSR) show a 22.63% (nRMSE) improvement compared with using only spectral VIs (LCI, NDVI using MSR) and a 21.24% (nRMSE) improvement compared with using only image textures (COR1, DIS1, SE30, EN30 using MSR). Thus, the combined use of image textures and VIs can help improve estimates of AGB under conditions of high canopy coverage. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
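The texture features in the record above are classical gray-level co-occurrence matrix (GLCM) statistics computed from UAV RGB imagery and then combined with VIs in a regression model. The following is a minimal sketch using scikit-image (≥0.19, where the functions are spelled graycomatrix/graycoprops) and scikit-learn; the window handling, gray-level quantization, VI choice, and plain linear regression are simplifications, not the paper's exact MSR setup.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.linear_model import LinearRegression

def glcm_features(gray_patch, levels=32):
    """GLCM statistics for one 8-bit grayscale plot image (quantized to `levels`)."""
    q = np.floor(gray_patch / 256.0 * levels).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    props = ["contrast", "dissimilarity", "homogeneity", "energy", "correlation"]
    return np.array([graycoprops(glcm, p).mean() for p in props])

# Placeholder per-plot inputs: a grayscale image patch, two VI values, and measured AGB (t/ha).
n_plots = 60
patches = (np.random.rand(n_plots, 64, 64) * 255).astype(np.uint8)
vis = np.random.rand(n_plots, 2)              # e.g., EVI2 and one other VI (assumed)
agb = np.random.uniform(2, 14, n_plots)

texture = np.vstack([glcm_features(p) for p in patches])
X = np.hstack([texture, vis])                 # combined texture + VI predictors

model = LinearRegression().fit(X, agb)
print("Calibration R^2:", model.score(X, agb))
```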
9. Using an optimized texture index to monitor the nitrogen content of potato plants over multiple growth stages.
- Author
- Fan, Yiguang, Feng, Haikuan, Yue, Jibo, Jin, Xiuliang, Liu, Yang, Chen, Riqiang, Bian, Mingbo, Ma, Yanpeng, Song, Xiaoyu, and Yang, Guijun
- Subjects
- POTATOES, NITROGEN content of plants, FOOD texture, NITROGEN fertilizers, STANDARD deviations, OPTIMIZATION algorithms
- Abstract
• Our method focuses on exploring the use of texture indices to estimate potato PNC.
• A texture index was developed for PNC estimation throughout the potato growth cycle.
• O-TTI3 can further improve the estimation accuracy of potato PNC.
Plant nitrogen content (PNC) is vital for evaluating crop nitrogen nutrient status and net primary productivity. Therefore, the rapid and accurate acquisition of crop PNC information can help reduce fertilizer use and increase efficiency in modern agriculture. This study investigated whether an optimized texture index constructed from hyperspectral characteristic-wavelength images acquired by an unmanned aerial vehicle (UAV) can be used to estimate potato PNC over multiple growth stages. A potato field trial conducted in 2019 in Beijing, China, included different planting densities, nitrogen fertilizer gradients, and potato varieties with three replicates. A UAV remote sensing platform was used to acquire hyperspectral images during three critical N demand periods for potatoes. In addition, we simultaneously conducted field sampling to obtain ground-truth PNC measurements. Following the classical form of vegetation indices, 12 texture indices were constructed based on hyperspectral texture features, and an arbitrary variable combination optimization algorithm was used to optimize these indices. Finally, the texture index with the highest correlation with potato PNC was used as the best indicator for estimating potato PNC status over multiple growth stages, and we subsequently explored whether this indicator, combined with spectral information, could further improve the accuracy of potato PNC estimation. The results showed that (i) the optimal texture index TTI3 (R494-Cor, R578-Hom, R514-Sem) maintained a good linear relationship with PNC at the late stage of potato growth, and the accuracy and stability of the PNC estimation models constructed based on it were significantly better than those of a single texture metric; (ii) compared with spectral information alone, the texture index combined with spectral features improved the accuracy of potato PNC estimation. More specifically, TTI3 (R494-Cor, R578-Hom, R514-Sem) combined with the three-band spectral index TBI5 (530, 734, 514) achieved the best estimation accuracy, with calibration R², root mean square error (RMSE), and normalized RMSE of 0.77, 0.28%, and 9.88%, respectively. The results of this study showed that a texture index constructed by combining multiple texture metrics enhanced the association between texture features and potato PNC over multiple growth stages, thus improving the monitoring accuracy of potato nitrogen nutrition status. This study can provide a reference for accurately managing crop nitrogen nutrition. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
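The "texture index" idea in the record above applies vegetation-index-style algebra to GLCM metrics extracted from characteristic-wavelength images. The sketch below builds one such candidate index from three texture metrics and checks its correlation with PNC; the normalized-difference-style formula is a hypothetical illustration, not the paper's TTI3 definition.

```python
import numpy as np
from scipy.stats import pearsonr

def texture_index(cor_494, hom_578, sem_514):
    """Hypothetical three-metric texture index in a normalized-difference style.
    Inputs are per-plot GLCM metrics: correlation at 494 nm, homogeneity at 578 nm,
    and second moment at 514 nm (the metric/wavelength pairing follows the abstract)."""
    return (cor_494 - hom_578) / (cor_494 + hom_578 + sem_514 + 1e-9)

# Placeholder per-plot texture metrics and measured PNC (%).
n = 50
cor_494 = np.random.rand(n)
hom_578 = np.random.rand(n)
sem_514 = np.random.rand(n)
pnc = np.random.uniform(1.5, 4.5, n)

tti = texture_index(cor_494, hom_578, sem_514)
r, p = pearsonr(tti, pnc)
print(f"Pearson r between candidate texture index and PNC: {r:.3f} (p = {p:.3f})")
```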