Article

Monitoring Key Wheat Growth Variables by Integrating Phenology and UAV Multispectral Imagery Data into Random Forest Model

1 College of Agronomy, Henan Agricultural University, Zhengzhou 450046, China
2 Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
3 College of Geodesy and Geomatics, Shandong University of Science and Technology, Qingdao 266590, China
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(15), 3723; https://doi.org/10.3390/rs14153723
Submission received: 30 June 2022 / Revised: 26 July 2022 / Accepted: 29 July 2022 / Published: 3 August 2022

Abstract
Rapidly developing remote sensing techniques are shedding new light on large-scale crop growth status monitoring, especially in recent applications of unmanned aerial vehicles (UAVs). Many inversion models have been built to estimate crop growth variables. However, present methods focus on building models for each single crop stage, and the features generally used in the models are vegetation indices (VIs), alone or combined with other data derived from UAV-based sensors (e.g., texture, RGB color information, or canopy height). These models are thus either limited to a single stage or unstable across stages. To address these issues, this study selected four key wheat growth parameters for inversion: above-ground biomass (AGB), plant nitrogen accumulation (PNA) and concentration (PNC), and the nitrogen nutrition index (NNI). Crop data and multispectral data were acquired in five wheat growth stages. Band reflectance and VIs were obtained from the multispectral data, and the five stages were recorded as phenology indicators (PIs) according to Zadoks' scale. These three types of data formed six combinations (C1–C6): C1 used all band reflectances, C2 used all VIs, C3 used bands and VIs, C4 used bands and PIs, C5 used VIs and PIs, and C6 used bands, VIs, and PIs. Some of the combinations were integrated with PIs to verify whether PIs can improve model accuracy. Random forest (RF) was used to build models with the different combinations and to evaluate feature importance. The results showed that the models of all combinations performed well in modeling the crop parameters, with R2 from 0.60 to 0.79 and NRMSE from 10.51% to 15.83%. The models were then optimized to understand the importance of PIs; the combinations integrated with PIs showed better estimations, demonstrating the potential of using PIs to minimize the number of features while still achieving good predictions.
Finally, the model results were evaluated by growth stage and fertilizer treatment, and the models performed well in each case (R2 > 0.6). This paper provides a reference for monitoring and estimating wheat growth parameters based on UAV multispectral imagery and phenology information.

1. Introduction

Winter wheat is one of the most widely cultivated and fertilized food crops, and it is used in many products for human consumption [1]. Wheat growth monitoring is a crucial part of achieving a reasonable yield, and field management is the step that follows analysis of the crop growth situation. Among the different kinds of field management, fertilization has been recognized as the most important; proper fertilization is the key strategy to secure optimal crop yield. Nitrogen-based fertilizers provide a vast supply of N for crop growth. Nitrogen (N) participates in multiple metabolic processes and structural components, which makes it one of the most important elements in both crop and environmental sciences, and it is essential for both wheat growth and yield formation [2]. While N deficiency makes it difficult to achieve the target yield, overfertilization is a common mistake in the unilateral pursuit of high yield. Excessive N application leads to delayed maturity, which reduces yield, and to adverse environmental impacts such as soil contamination; furthermore, nitrogen is a major contributing source of greenhouse gases (GHGs) [3,4]. Diagnosing crop growth status and applying variable-rate fertilization can help avoid these problems. The principle of precision agriculture is to manage the spatial and temporal variability of fertilizer; therefore, determining crop status is the key procedure in practice [5]. Several parameters have been used to measure plant growth condition; for example, plant nitrogen concentration (PNC) and accumulation (PNA) are direct indicators of crop growth. Above-ground biomass (AGB) is another frequently used indicator because it is a proxy of the final yield. Combining these parameters, the nitrogen nutrition index (NNI) is established from critical N dilution theory [6,7]. These parameters have proven effective for variable-rate fertilization; the remaining issue, however, is the speed of the process [8].
The timely and accurate monitoring of crop growth status is necessary for modern agricultural management. Traditionally, field samples are taken for lab analysis to acquire the growth variables, which is time-consuming, and the results are spatially limited; thus, it is important to find a more effective approach [9]. Remote-sensing technology offers an alternative for assessing crop nutrient status, and crop parameters have been retrieved from remotely sensed data by different approaches [10]. With the emerging unmanned aerial vehicle (UAV) platform, which carries passive or active sensors, it is becoming easier to obtain rapid, non-destructive spatial estimates of crop growth parameters [11]. UAVs have the advantages of flexibility and versatility; they operate at relatively low cost while acquiring data of high spatial and temporal resolution. In practice, crop AGB can be estimated from RGB or multispectral images, and other N-related crop parameters can also be estimated by fusing image and spectral information. The reported modeling process includes seeking sensitive bands or VIs and then using them to build models, and some studies have investigated the optimal time window for growth monitoring [12,13,14,15]. These previous experiments have demonstrated the feasibility of UAV applications.
Methods for retrieving crop growth variables can be categorized into three types: empirical, physical, and hybrid. Previous research focused on simple linear or non-linear relationships between vegetation indices (VIs) and specific crop parameters [12,16], or used physically based methods, known as radiative transfer models, to retrieve crop N status [17]. To make full use of the abundant UAV data, including band reflectance, VIs, texture, and other features, machine learning regression algorithms of various kinds have been introduced into quantitative vegetation remote sensing [18,19,20]. Machine learning methods are becoming powerful modeling tools for interpreting large amounts of remotely sensed data. Among the different regression algorithms, random forest (RF) is a classic and powerful method [21]. It is an ensemble learning model that combines a large number of decision trees, which makes it robust when the model has many input variables. RF models have been widely used in crop classification, growth monitoring, and yield forecasting [22,23]. Additionally, RF has prevailed in previous comparison studies of different algorithms for monitoring different crops [24,25]. These studies yielded satisfactory results using RF models, both in classification and prediction, which strongly illustrates the feasibility of the RF model.
Although crop parameters are estimated by different techniques using remotely sensed data, a common problem is that these models neglect the fact that crops differ significantly across growth stages. Crop growth is an allometric process. The morphological traits of a crop can change substantially from the vegetative to the reproductive stages, and the leaves, stems, and spikes play different roles in different stages, which significantly affects remote-sensing observations. In optical crop remote sensing, the leaves, stems, and panicles are the spectrally responsive organs. Leaves are always the major source of reflection across the stages [26]; however, as the growth stages progress, leaves transform from a sink into a source of assimilates. When leaves are sinks, their major function is the storage of photosynthetic products; once mature, they become photosynthetic producers. At the same time, the stems become the major storage of assimilates, especially after the shift from the vegetative to the reproductive stage, and the panicles become the new sink for yield formation. During this period, the characteristics of the leaves, stems, and panicles keep changing, affecting sink and flow interrelation and transformation; consequently, biomass and N deposits translocate among the different sinks accordingly [27]. That is, the leaves provide substantial reflectance while not always being responsible for the majority of crop biomass and nitrogen storage. Since this asymmetric information is hard to capture in spectral information alone [28], it might explain the deficits of simple linear models; furthermore, this phenomenon suggests that phenology information should be considered in crop monitoring models.
Several researchers have pointed out that crop phenology is important in predicting crop growth conditions or forecasting yield [29,30].
Current machine learning models for monitoring crop growth status have been established using multiple vegetation indices alone [12,31]. Beyond this, efforts include fusing color features or combining them with cultivar information [32,33]. Moreover, canopy fluorescence is an increasingly popular technique for growth monitoring [34], and spectrum-based deep learning is also a rational approach [35]. These studies utilized spectral information together with other vegetation characteristics, but only a few considered the variation of phenology and integrated it into the model. Since crop growth status varies significantly from stage to stage, optical sensors have a limited ability to track this inherent variation. Adding other types of data can be descriptive for different stages, which suggests that the direct use of phenology is appealing for addressing model instability in multi-stage scenarios. Therefore, the main objectives of this study are:
(1)
To verify the phenology effect on retrieving wheat crop parameters from UAV multispectral data;
(2)
To use RF models to evaluate the accuracy of different combinations of band reflectances, VIs, and PIs in wheat parameter prediction;
(3)
To specify the accuracy of the established models across growth stages and N treatments to understand model applicability.

2. Materials and Methods

2.1. Experimental Site and Design

We conducted the experiment during the 2020–2021 winter wheat growing season at the Xiaotangshan National Experiment Station for Precision Agriculture (116°26′36″ E, 40°10′44″ N) in Beijing, China (Figure 1). The field site is in the northernmost section of the North China Plain, which has a mid-latitude temperate monsoon semi-humid climate, with an average altitude of 36 m, an average annual precipitation of 500–600 mm, and an average annual temperature of 12 °C. The annual amount of solar radiation is 4800 MJ m−2 (China Meteorological Data Service, http://data.cma.cn/ accessed on 6 January 2022). Weather information for the experiment duration is provided in Figure 2.
This study was part of an ongoing long-term fertilizer experiment. We selected two local wheat cultivars (Triticum durum L. cv. JH11 and cv. ZM1062) and four nitrogen fertilizer rates (0, 90, 180, and 270 kg N/ha) for the field experiment. The plot size was 9 × 15 m. We set the row spacing to 15 cm, and we uniformly set the plant density to 360 × 10⁴ plants ha−1. We arranged the treatments in all field experiments in a complete randomized block design with four replicates. We divided the nitrogen fertilizer application into two equal splits, applied as a base dressing before sowing and a top dressing at the jointing stage.
The primary soil type is a clay loam according to the Food and Agriculture Organization (FAO) soil classification, with a pH of 7.7, 19 g·kg−1 organic matter, 1.01 g·kg−1 total N, 14.5 mg·kg−1 Olsen-P, and 127.9 mg·kg−1 available K in the 0–20 cm surface soil layer. Other field management procedures, including weed control, pest management, and the application of phosphate and potassium fertilizer, followed local standard practices for winter wheat production.

2.2. Crop Data Acquisition and Calculation of Nitrogen Nutrient Index

We conducted five samplings at five key wheat growth stages (jointing, booting, anthesis, early filling, and late filling). There were 24 samples in the jointing stage and 32 in each of the other stages, for a total of 152 samples. During field sampling at each stage, we randomly collected 20 tillers around the white frame of each plot. We immediately took the fresh samples to the laboratory and separated them into leaves and stems, and also spikes after they emerged. We put the samples into paper bags, placed them in an oven at 105 °C for 20 min to stop metabolism, and then dried them at 80 °C to constant weight. We recorded the dry weight of each sample with a balance accurate to 0.001 g. After acquiring the AGB, we analyzed all samples for N concentration using the micro-Kjeldahl method. We calculated the total plant N concentration as the ratio of total N accumulation to AGB.
AGB = (LW + SW + PW) · T / (20 · L)
PNA = LW · LN + SW · SN + PW · PN
PNC = PNA / AGB
where LW, SW, and PW are the dry weights of the leaf, stem, and panicle samples, respectively; LN, SN, and PN are the N concentrations of the leaf, stem, and panicle samples, respectively; T is the number of winter wheat stems per unit area; and L is the row spacing (15 cm). PNA is plant nitrogen accumulation and PNC is plant nitrogen concentration.
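The three calculations above can be sketched in Python. This is a minimal illustration with invented sample numbers, not the authors' code; note that the PNA equation is written per sample, exactly as printed.

```python
def compute_agb(lw, sw, pw, t, row_spacing=0.15, n_tillers=20):
    """Above-ground biomass per unit area from a 20-tiller sample.

    lw, sw, pw: dry weights of leaf, stem, and panicle samples;
    t: number of stems per unit area; row_spacing: L in the equation (15 cm).
    """
    return (lw + sw + pw) * t / (n_tillers * row_spacing)


def compute_pna(lw, ln, sw, sn, pw, pn):
    """Plant N accumulation: sum of organ dry weight times organ N concentration."""
    return lw * ln + sw * sn + pw * pn


def compute_pnc(pna, agb):
    """Plant N concentration: total N accumulation divided by biomass."""
    return pna / agb
```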
As described by Lemaire [36], we calculated the nitrogen nutrition index (NNI) of each treatment within various growth stages by the following equation:
NNI = Na / Nc
where Na is the actual N concentration and Nc is the critical N concentration. The Nc curve used in this study was adopted from previous research [37]:
Nc = 5.35 · AGB^(−0.53)
The equation was established from a previous nitrogen fertilizer experiment in the same field. Crop nitrogen status is optimal when NNI is between 0.95 and 1.05; there is an N surplus when NNI exceeds 1.05 and an N deficit when it falls below 0.95.
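As a sketch, the NNI diagnosis follows directly from the two equations above; the threshold logic implements the 0.95–1.05 band described in the text.

```python
def critical_n(agb):
    """Critical N concentration from the dilution curve Nc = 5.35 * AGB^(-0.53)."""
    return 5.35 * agb ** (-0.53)


def nni(actual_n, agb):
    """Nitrogen nutrition index: actual over critical N concentration."""
    return actual_n / critical_n(agb)


def diagnose(value):
    """Map an NNI value to the nitrogen status classes used in the text."""
    if value > 1.05:
        return "surplus"
    if value < 0.95:
        return "deficit"
    return "optimal"
```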

2.3. Acquisition and Preprocessing of UAV Images

We used a DJI Phantom 4 Multispectral four-rotor unmanned aerial vehicle (UAV) (DJI-P4M, SZ DJI Technology Co., Ltd., Shenzhen, China) to capture multispectral images. The UAV carried a sensor array of six 2-megapixel cameras: Blue (450 ± 16 nm), Green (560 ± 16 nm), Red (650 ± 16 nm), RE (730 ± 16 nm), NIR (840 ± 26 nm), and visible light (RGB). Details of the UAV and sensors are given in Figure 3.
We conducted five flights, one in each key wheat growth stage. We set the flight height to 30 m, the speed to 4 m/s, and the image overlap and sidelap to 80%; the resulting ground spatial resolution was 1.6 cm. We performed all flight missions between 10:00 and 12:00 on clear and cloudless days. Prior to each flight, we collected calibration images with a standard reflectance panel. The panel is a fine cloth installed inside a plastic box, with base reflectances of 0.797, 0.872, 0.877, 0.875, and 0.867 for the Blue, Green, Red, RE, and NIR bands, respectively. We calibrated each image against the panel reflectance of the corresponding band. After each flight, we processed the calibrated multispectral images into ortho-mosaic maps using DJI Terra software (Terra, SZ DJI Technology Co., Ltd., Shenzhen, China) for further analysis.
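A single-panel calibration of this kind amounts to scaling raw digital numbers by the panel's known band reflectance. The sketch below assumes a simple proportional (empirical-line through the origin) model; the DN values are hypothetical, while the panel reflectances are those listed above.

```python
import numpy as np

# Known base reflectance of the calibration panel per band (from the text).
PANEL_REFLECTANCE = {"blue": 0.797, "green": 0.872, "red": 0.877,
                     "rededge": 0.875, "nir": 0.867}


def calibrate_band(image_dn, panel_dn, band):
    """Convert raw digital numbers to reflectance via the panel observation."""
    return np.asarray(image_dn, dtype=float) / panel_dn * PANEL_REFLECTANCE[band]
```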

2.4. Feature Extraction and Determination

We used the ortho-mosaic maps for band reflectance and VI extraction. We extracted all data within the white frames using shapefiles for each stage, and we took the plot-level average of all pixel values as the extraction result.
We used three types of features in this study: the original multispectral band reflectances as the first type, vegetation indices as the second, and phenology indicators (PIs) as the third. Because the first two types were extracted from UAV images, for consistency we acquired all of them from within the white frame (Figure 1). We collected the original reflectance of each band as the first data type, and then selected several vegetation indices as the second; these VIs had been used in previous studies to model AGB or to retrieve key growth parameters, indicating that all of the selected VIs have reasonable potential for crop nitrogen status monitoring. As the third type, we recorded the five stages that represent the typical wheat growth process as Zadoks growth stages, i.e., phenology indicators. Here, ZS33, ZS47, ZS65, ZS75, and ZS80 represent the wheat jointing, flag leaf, anthesis, early filling, and late filling stages, respectively. To maintain data uniformity, we encoded the phenology as stage numbers. The three types of data are listed in Table 1.
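To illustrate how a plot's feature vector is assembled, the sketch below computes two common VIs (NDVI and NDRE, both standard formulas) from plot-mean reflectances and appends the Zadoks-stage code as the PI; the reflectance values here are hypothetical.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)


def ndre(nir, rededge):
    """Normalized Difference Red Edge index."""
    return (nir - rededge) / (nir + rededge)


# Hypothetical plot-mean band reflectances at the jointing stage (ZS33).
plot = {"red": 0.05, "rededge": 0.20, "nir": 0.45, "zs": 33}

# Feature vector mixing band reflectance, VIs, and the phenology indicator,
# in the spirit of combination C6.
features = [plot["red"], ndvi(plot["nir"], plot["red"]),
            ndre(plot["nir"], plot["rededge"]), plot["zs"]]
```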

2.5. Data Analysis and Model Establishment

We analyzed the four crop parameters by Tukey's HSD test to distinguish the differences across the five stages. We performed a three-way analysis of variance (ANOVA) to explore how much phenology contributes to the biometrics, analyzing the effects of cultivar, N treatment, and phenology on AGB, PNA, PNC, and NNI. We used Duncan's test to analyze differences in parameter averages between treatments. The threshold for statistical significance was p < 0.05.
In this study, we employed RF to build models for AGB, PNA, PNC, and NNI. As mentioned above, we grouped all the features into three types of data, and we selected these features as six combinations (Table 2).
RF is an ensemble technique that combines multiple decision trees, each of which predicts independently; the final prediction is made by aggregating all tree predictions. It is a practicable model when dealing with small sample sizes. RF models involve a hyperparameter-tuning process to maximize accuracy; thus, we optimized the hyperparameters through 10-fold cross-validation. The only parameter in random forests that typically needs optimization is the number of trees in the ensemble. We settled on 200 decision trees based on stable results from preliminary validation; this number has also been used in related remote-sensing studies [50].
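The RF setup described above (200 trees, 10-fold cross-validation) can be sketched with scikit-learn; the feature matrix below is synthetic stand-in data, not the field dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score

# Synthetic stand-in for a feature combination such as C6
# (e.g., columns Red, NDVI, NDRE, PI) and a crop parameter such as AGB.
rng = np.random.default_rng(0)
X = rng.uniform(size=(60, 4))
y = 2.0 * X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.05, size=60)

# 200 trees, as settled on in the study; other hyperparameters left at defaults.
model = RandomForestRegressor(n_estimators=200, random_state=0)
cv = KFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
mean_r2 = scores.mean()
```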
To analyze the specific effect of the PIs, which are of most interest here, we performed multiple iterations to deconstruct the model feature by feature. First, we ranked the features based on the Bayesian framework, so the ranking result can be used as a feature-reduction tool. Then, we removed the least relevant feature in each iteration and eventually identified the most sensitive features. Through these iterations, we determined the minimum number of features needed to build a model with acceptable accuracy.
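The iterative feature-reduction loop can be sketched as backward elimination. The paper ranks features within a Bayesian framework; in this sketch the RF's built-in impurity importances stand in for that ranking step, and the feature names and data are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def backward_elimination(X, y, names, min_features=1):
    """Drop the least important feature each round; return the visited subsets."""
    idx = list(range(X.shape[1]))
    history = []
    while True:
        rf = RandomForestRegressor(n_estimators=200, random_state=0)
        rf.fit(X[:, idx], y)
        history.append([names[i] for i in idx])
        if len(idx) == min_features:
            return history
        # remove the feature with the lowest impurity importance
        idx.remove(idx[int(np.argmin(rf.feature_importances_))])


rng = np.random.default_rng(1)
X = rng.uniform(size=(50, 4))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=50)  # only feature 0 matters
history = backward_elimination(X, y, ["B", "EVI2", "NDRE", "PI"])
```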

2.6. Model Evaluation

We built and evaluated all models by 10-fold cross-validation, and we used the mean results of cross-validation in the model comparisons. We used three commonly used indices (R2, RMSE, and NRMSE) to compare the performance of the generated models. The calculation equations of R2, RMSE, and NRMSE are as follows:
R² = 1 − Σ(yᵢ − ŷᵢ)² / Σ(yᵢ − ȳ)²
RMSE = √(Σ(yᵢ − ŷᵢ)² / n)
NRMSE = RMSE / N × 100%
where the sums run over i = 1, …, n; yᵢ and ŷᵢ are the measured and predicted values for sample i, respectively; ȳ is the mean of the measured values; n is the number of samples in the calibration or validation set; and N is the average value of the samples.
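Written out in code, the three metrics are a direct transcription of the equations above:

```python
import numpy as np


def r2(y, y_hat):
    """Coefficient of determination."""
    ss_res = np.sum((np.asarray(y) - np.asarray(y_hat)) ** 2)
    ss_tot = np.sum((np.asarray(y) - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot


def rmse(y, y_hat):
    """Root mean square error."""
    return float(np.sqrt(np.mean((np.asarray(y) - np.asarray(y_hat)) ** 2)))


def nrmse(y, y_hat):
    """RMSE normalized by the mean of the measured values, as a percentage."""
    return rmse(y, y_hat) / float(np.mean(y)) * 100.0
```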

3. Results

3.1. Statistics of Crop Data and Their Relationships with Selected Bands and VIs

Field experiment results of the four parameters are summarized in Figure 4. Across the stages, AGB and PNA increased with a steady overall trend; noticeably, however, their growth rates peaked at different stages. AGB grew fastest around ZS47–ZS65, with significant differences driven by the onset of the filling stages. PNA grew fast in the early stages (ZS33–ZS47) with significant differences, indicating that plants absorb plenty of nitrogen at the jointing stage. PNC stabilized after decreasing in the early stages, showing a significant downward trend especially between ZS33 and ZS47. The NNI trend across stages fluctuated little under the various nitrogen treatments in this study, which shows it is a promising indicator for evaluating wheat N status.
Linear regressions were performed between the biometrics and the reflectance of different bands; the Red reflectance had the strongest relationship with all biometrics. Green and Blue had relatively strong relationships with AGB, PNA, and NNI, while NIR related well to PNA, PNC, and NNI. RE showed low correlation with the biometrics. From another view, PNC is hard to model from band reflectance, whereas the other three parameters are comparatively easier. The highest R2 was 0.42, between the Red band reflectance and NNI.
Regression between the 12 selected VIs and the four biometrics is recorded in Table 3. Almost all VIs were closely related to the biometrics; the best coefficient of determination was obtained using NDRE for AGB (R2 = 0.64). Overall, the VIs had the best relationships with NNI (0.34–0.63), related strongly to AGB (0.1–0.64) and PNA (0.29–0.58), and had relatively lower R2 with PNC (0.01–0.39).

3.2. Phenology Contribution in Estimating Crop Data

The results in Section 3.1 showed that the best performing linear model was built from the Red reflectance. Therefore, it was selected along with the commonly used NDVI to plot Figure 5 and Figure 6, which show the linear relationships of these single factors with the biometrics across different stages; clear phenology differentials can be noticed. In Figure 5a, the slope and intercept across stages vary from −72.42 to −18.56 and from 4.33 to 16.57, respectively. Similar gaps appear in Figure 5b,c; in Figure 5d, however, the slope and intercept change only slightly between stages. It is inferable that the VIs behave similarly: for NDVI, the slopes and intercepts for AGB, PNA, and PNC vary drastically, whereas those for NNI change little. There is an apparent negative correlation between Red band reflectance and the crop parameters and a positive correlation between NDVI and the crop parameters. This is because, as crops grow, red light is increasingly absorbed by the vegetation while the NIR band shows a higher reflection rate; at late stages, mature crops tend to show low reflection rates in both bands.
Nitrogen treatment and phenology are the main factors affecting the four parameters, as shown by the three-way ANOVA (Table 4). Nitrogen treatment contributed most to NNI (72.64%) and least to AGB (36.81%). Phenology contributed most to AGB and PNC, less to PNA (35.02%), and least to NNI (16.04%). Besides the main factors, the interactions between nitrogen fertilizer and phenology also showed significant effects (5.88–12.22%) on the biometrics. From a statistical view, the ANOVA results strongly suggest that phenology plays an important role in modeling crop parameters.

3.3. RF Model Results of Different Combinations

3.3.1. Model Results Using All Features

The models were built for the crop parameters according to the combinations in Table 2, using all features in each combination. The results are shown in the heatmap of Figure 7, which compares the different combinations. Generally, combinations with more features performed better: in Figure 7a, R2 increases from C1 to C6, while Figure 7b,c show RMSE and NRMSE decreasing from C1 to C6. The model with the poorest predictions was C1, which uses only band reflectance. The best results were C5 for PNC, with R2 = 0.77, RMSE = 0.2, and NRMSE = 10.32, and C5 for NNI, with R2 = 0.73, RMSE = 0.15, and NRMSE = 13.92, while C6 had the highest R2 and lowest RMSE and NRMSE for AGB and PNA.
Compared with C1, C2, and C3, the combinations C4, C5, and C6 include PIs among their features, and the latter showed better estimations. Comparing C1 with C4, the PIs significantly improved the band-reflectance-only model for the four crop parameters, with R2 rising from 0.48–0.65 to 0.69–0.76 and NRMSE falling from 15.09–15.83 to 11.19–14.96. This showed that PIs have the potential to enhance the models of AGB, PNA, and PNC. Furthermore, comparing C2 with C5, PIs only improved the VI model in predicting PNC (R2 from 0.66 to 0.77), and C6 performed better than C3 or C5, showing the best results when using all features.
Overall, all models showed good estimations of the parameters, particularly C5 and C6, which showed great results for all crop parameters. C5 was the combination of VI and PI, and C6 was the combination of all three types of data. The best model for AGB was C6, and the best model for PNA, PNC, and NNI was C5. The results showed that AGB, PNA, and PNC were better estimated when integrated with PIs; however, NNI showed insensitivity to PIs.

3.3.2. Model Iteration and Feature Selection Results

In Figure 7, the C4, C5, and C6 models integrated with PIs showed generally high performance, especially C6, which had the highest R2 (0.79). The comparisons between C1 and C4 and between C3 and C6 showed that phenology might play an important role in predicting key wheat growth parameters. Therefore, we disassembled the model to determine whether the models were enhanced by coupling phenology indicators. C5 and C6 were identified as the best models for the four crop parameters; since they had similar prediction abilities and C6 took all features into the model, a feature-by-feature iteration was performed for C6, and the results are presented in Figure 8.
Of all the crop parameter models presented in Figure 8, AGB and PNA were predicted with narrower error ranges, while PNC and NNI showed wider ones. All models tended to reach stable estimations with more than three features. Therefore, we selected the four-feature models for further analysis; their details are presented in the text of the figures. The AGB and PNC models were integrated with PIs, while the PNA and NNI models were built from VIs. Overall, EVI2, NDRE, and PIs appeared most often. However, the highest mean R2 varied with the number of features. For the AGB models, R2 was highest at four features (B, EVI2, NDRE, and PIs). In Figure 8c, for PNC, the three features EVI, LCI, and PIs gave the highest R2. For the PNA models, although the model seemed stable with few features, the best R2 occurred at 11 features. The best NNI models were built with two VIs, MNLI and NDRE. It is worth mentioning that PIs took great priority in the AGB and PNC models, because the best models emerged once PIs appeared; however, PIs showed little significance in the PNA and NNI models.
The relative importance of the different variables is presented in Figure 9, with the number of features set to four for each parameter. In the AGB, PNA, PNC, and NNI models, the relative importance of NDRE was 46.8%, 39.9%, 11.4%, and 42.7%, respectively. The PIs of interest had a relative importance of 10.1% and 47.9% in the AGB and PNC models, respectively. These results show that PIs contribute substantially to the performance of the AGB and PNC models; in PNC especially, PIs showed the highest importance among all features. Taking Figures 8 and 9 together, it is obvious that PIs can significantly improve model accuracy in few-feature circumstances.

3.4. Model Validation and Spatial Results of UAV Data

After the iterations, once the best features were found, the models for the four parameters were evaluated using all data. The results presented in Figure 10 show that the models of all parameters achieved high estimation accuracy. Among the four parameters, AGB and PNC were better estimated (R2 > 0.80), and NNI had the lowest R2 of 0.74. These results showed that the models were effective and demonstrated the feasibility of integrating PIs into the machine learning model.
Among all models built in this study, the best models were used to produce pixel-level spatial maps of the four wheat parameters. The results for one key stage, anthesis (ZS65), are shown in Figure 11. The spatial results were highly consistent with the field experiment: the within-plot variance of each treatment is low, and the differences between plots under different treatments are obvious. In this manner, the results are ready for use in field-level precision fertilization.

3.5. Model Accuracy in Different Stages and N Treatments

A further analysis was performed to specify how the model accuracy varied across stages and N treatments; all prediction results from Section 3.4 were extracted and grouped by stage or N treatment. The stage results are presented in Figure 12. The PNA, PNC, and NNI models showed higher R2 and lower RMSE across stages, with average R2 of 0.72, 0.70, and 0.75, respectively, while AGB showed a relatively lower average R2 of 0.61 and a higher RMSE. As the growth period advanced, the trends of the four parameters differed: the PNA and NNI models showed steady results, the AGB model estimated better in the late stages, and conversely, the PNC model performed worse in the late stages. Among all stages, ZS65 generally had high model accuracy.
Similarly, Figure 13 shows the analysis grouped by N treatment. The AGB and PNC models showed higher R2, while the PNA and NNI models performed worse across N treatments. Among the treatments, N0, N2, and N3 showed average R2 of 0.63, 0.68, and 0.68, respectively. The AGB model performed well under the N1 treatment; however, the other three models did not. Generally, the AGB model showed good results under all N treatments, and the other models performed well under high N treatments. These results clarify that the models built in this study have different accuracies among stages and N treatments, meaning that the PIs can have different impacts on model accuracy depending on the experimental setup.

4. Discussion

4.1. Comparison between Models Using Band or VI

In this paper, we first evaluated the linear models built from bands and VIs. Generally, the band reflectance models performed less effectively than the VI models, which is easy to understand because a VI synthesizes information from multiple bands and can therefore perform better than a single band; when more bands are taken into consideration, model performance improves. Similar results were found when using band information to model the rice leaf area index (LAI) [51]. As shown in Figure 5 and Figure 6a, a common situation is that the band or VI saturates in the late growth stages due to high vegetation cover or biomass. A possible explanation is that the Red band cannot penetrate deeply once the canopy is densely closed, as is the case for the other visible-light bands. In contrast, RE is less absorbed by the upper canopy, which means it can penetrate further into the canopy and carry more canopy information [13,52]. This physical limitation could be the major cause of the lower accuracy of band-based models.
In this paper, RF was also used to build band and VI models for predicting wheat growth status parameters. Compared with the linear models, accuracy improved markedly even when using band information alone. As the results in Figure 7 show, C1 had the lowest accuracy, while C2 and C3 showed strong predictive ability. The saturation problem was alleviated in the RF models; moreover, NDRE and LCI were selected as features for predicting AGB, PNC, and NNI, a reasonable outcome given that both indices use the RE band [12,53].
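The feature-importance ranking underlying this selection comes directly from the fitted random forest. A sketch on synthetic stand-in data (all values illustrative, not the study's measurements), in which an NDRE-driven target makes the NDRE column dominate the importances, analogous to Figure 9:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic plot-level features: 5 band reflectances (B, G, R, RE, NIR)
# plus NDRE, with a target driven by NDRE -- illustrative only.
rng = np.random.default_rng(0)
bands = rng.uniform(0.02, 0.6, size=(120, 5))
ndre = (bands[:, 4] - bands[:, 3]) / (bands[:, 4] + bands[:, 3])
X = np.column_stack([bands, ndre])
y = 10.0 * ndre + rng.normal(0.0, 0.5, 120)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
importance = rf.feature_importances_  # one normalized score per feature
```

`feature_importances_` sums to 1 across features, so the scores can be read directly as the relative-importance percentages reported in the paper.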

4.2. Integrating PIs into Crop Growth Monitoring Is Promising

At present, studies have used texture information [8,54], RGB color features [32,55], crop height [56], and meteorological or soil data [57] as variables for crop-monitoring models, along with deep learning approaches such as convolutional neural networks (CNNs) [58,59]. These methods successfully built crop-status-monitoring models; however, the models can be complicated and redundant. In fact, optical sensors tend to show stable variation in the reflectance gradient within a single experiment; across several experiments spanning different growth stages, however, this stability is disturbed by many factors, such as overall plant growth status, soil conditions, and their interactions [30]. Hence, it is necessary to add phenology information to correct for this deviation. In this paper, phenology information was added to the retrieval models and proved highly important in the resulting models.
PIs have been emphasized by many previous studies [29,60,61]. Their merits can be summarized as follows: (1) compared with other variables, they are easily acquired during field experiments; anyone can distinguish basic crop stages and record them on the Zadoks scale. (2) PIs help build models with fewer features, reducing computation time while maintaining model accuracy (Figure 8). (3) PIs are expandable because of their consistency within a particular region [62]. For further expansion, PIs can be converted to other parameters, such as growing degree days (GDD), day of year (DOY), and days after sowing (DAS) [60]. All of these parameters are either meteorological or time-series variables, meaning they can be determined without agronomic expertise.
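The effect described in merit (2) can be illustrated on synthetic data: when the VI-to-target relationship shifts with growth stage (as the phenology contributions in Table 4 indicate), adding the stage code as one extra feature raises cross-validated accuracy. A sketch under that assumption (data and numbers are illustrative, not the study's):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Illustrative data: an NDVI-like feature whose relation to the target
# shifts with the Zadoks stage code, mimicking a phenology effect.
rng = np.random.default_rng(1)
ndvi = rng.uniform(0.2, 0.9, 150)
stage = rng.choice([33, 47, 65, 75, 80], 150).astype(float)
y = ndvi * (stage / 50.0) + rng.normal(0.0, 0.05, 150)

def cv_r2(X):
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    return cross_val_score(model, X, y, cv=5, scoring="r2").mean()

r2_vi = cv_r2(ndvi[:, None])                      # VI only
r2_vi_pi = cv_r2(np.column_stack([ndvi, stage]))  # VI + PI
```

Here `r2_vi_pi` exceeds `r2_vi`, because the stage column lets the trees separate the stage-dependent VI responses.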
Zadoks stages and other stage codes, such as the Feekes scale or BBCH [63], are abbreviated designations, and there are alternatives for PIs: beyond the GDD, DOY, and DAS mentioned above, leaf age is another index that can describe the full growth course, and a normalized index known as the relative growth stage (RGS) has also been proposed [61]. The potential of these indicators still needs to be examined.
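Of the alternative indicators, GDD is the most mechanical to derive: it accumulates daily mean temperature above a base threshold. A minimal sketch of the standard accumulation (the base temperature is an assumption; the paper does not specify one, though 0 °C is common for winter wheat):

```python
def growing_degree_days(daily_tmax, daily_tmin, t_base=0.0):
    """Accumulate GDD from daily max/min air temperatures (deg C).
    Days whose mean temperature falls below t_base contribute zero."""
    total = 0.0
    for tmax, tmin in zip(daily_tmax, daily_tmin):
        total += max((tmax + tmin) / 2.0 - t_base, 0.0)
    return total
```

A stage code recorded on a given date can thus be swapped for the GDD accumulated from sowing to that date, using only weather-station data.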

4.3. Other Machine Learning Models Integrated with PIs

This paper concluded that RF models combined with PIs can accurately predict crop growth status parameters. However, RF represents only the decision-tree type of machine learning model [23,32]; other types of machine learning models integrated with PIs also need to be tested. For example, partial least squares regression (PLSR) and support vector regression (SVR) are among the most commonly used machine learning algorithms in current remote-sensing data interpretation [19,33,56,64]. We therefore used all combinations in this research to build models with these methods, and the different machine learning models showed the same improvement when PIs were included.
The results showed trends similar to the RF models: C1 had lower accuracy, while C4, the combination of C1 and PIs, performed better. Likewise, slight improvements were observed between C2 and C5 and between C3 and C6. In summary, models built from band reflectance can be improved greatly by PIs, while models built from VIs improve only slightly. The detailed results are presented in Table 5 and Table 6.

5. Conclusions

In this paper, we examined the linear relationships between remote-sensing indices and crop growth parameters and verified that the parameters were significantly affected by phenology; statistically, phenology contributed 16.04–49.87% to the different crop variables. PIs were therefore integrated into the random forest model to evaluate whether they could improve its predictive ability. The results showed that most models provided accurate predictions of the selected parameters (R2 > 0.7) and that PIs were important features in the built models. The best-performing optimized RF model was further analyzed by nitrogen treatment and by stage; results varied among growth variables, with average R2 ranging from 0.61 to 0.75 across stages. These findings indicate that phenology should be considered in future studies. Moreover, PIs adapted differently to different types of remotely sensed data, so further studies coupling different PI data types are needed to better understand their integration. Future crop-monitoring work should consider crop phenology and possible ways of transforming it, as well as introduce other advanced machine learning regression methods to this subject.

Author Contributions

Conceptualization, S.H.; data curation, S.H. and Y.Z.; investigation, H.F.; methodology, Z.L. and J.C.; software, J.C.; validation, S.H. and Y.Z.; formal analysis, F.Z., H.Y. and X.M.; writing—original draft preparation, S.H. and Y.Z.; supervision and review, G.Y. and C.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by the Natural Science Foundation of China (42171303), Key scientific and technological projects of Heilongjiang province (2021ZXJ05A05), Chongqing Technology Innovation and Application Development Special Project (cstc2019jscx-gksbX0092, cstc2021jscx-gksbX0064), and the National Key Research and Development Program of China (2019YFE0125300).

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank Hong Chang and Weiguo Li for acquiring data in the field experiments of this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zörb, C.; Ludewig, U.; Hawkesford, M.J. Perspective on wheat yield and quality with reduced nitrogen supply. Trends Plant Sci. 2018, 23, 1029–1037. [Google Scholar] [CrossRef] [Green Version]
  2. Berger, K.; Verrelst, J.; Féret, J.-B.; Wang, Z.; Wocher, M.; Strathmann, M.; Danner, M.; Mauser, W.; Hank, T. Crop nitrogen monitoring: Recent progress and principal developments in the context of imaging spectroscopy missions. Remote Sens. Environ. 2020, 242, 111758. [Google Scholar] [CrossRef]
  3. Gao, Z.; Wang, C.; Zhao, J.; Wang, K.; Shang, M.; Qin, Y.; Bo, X.; Chen, F.; Chu, Q. Adopting different irrigation and nitrogen management based on precipitation year types balances winter wheat yields and greenhouse gas emissions. Field Crops Res. 2022, 280, 108484. [Google Scholar] [CrossRef]
  4. Subbarao, G.; Searchinger, T.D. Opinion: A “more ammonium solution” to mitigate nitrogen pollution and boost crop yields. Proc. Natl. Acad. Sci. USA 2021, 118, e2107576118. [Google Scholar] [CrossRef]
  5. Pullanagari, R.R.; Dehghan-Shoar, M.; Yule, I.J.; Bhatia, N. Field spectroscopy of canopy nitrogen concentration in temperate grasslands using a convolutional neural network. Remote Sens. Environ. 2021, 257, 112353. [Google Scholar] [CrossRef]
  6. Lemaire, G.; Meynard, J.M. Use of the nitrogen nutrition index for the analysis of agronomical data. In Diagnosis of the Nitrogen Status in Crops; Springer: Berlin/Heidelberg, Germany, 1997; pp. 45–55. [Google Scholar] [CrossRef]
  7. Justes, E.; Mary, B.; Meynard, J.-M.; Machet, J.-M.; Thelier-Huché, L. Determination of a critical nitrogen dilution curve for winter wheat crops. Ann. Bot. 1994, 74, 397–407. [Google Scholar] [CrossRef]
  8. Zheng, H.; Cheng, T.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Combining Unmanned Aerial Vehicle (UAV)-Based Multispectral Imagery and Ground-Based Hyperspectral Data for Plant Nitrogen Concentration Estimation in Rice. Front. Plant Sci. 2018, 9, 936. [Google Scholar] [CrossRef]
  9. Fu, Y.; Yang, G.; Li, Z.; Li, H.; Li, Z.; Xu, X.; Song, X.; Zhang, Y.; Duan, D.; Zhao, C.; et al. Progress of hyperspectral data processing and modelling for cereal crop nitrogen monitoring. Comput. Electron. Agric. 2020, 172, 105321. [Google Scholar] [CrossRef]
  10. Verrelst, J.; Camps-Valls, G.; Muñoz-Marí, J.; Rivera, J.P.; Veroustraete, F.; Clevers, J.G.P.W.; Moreno, J. Optical remote sensing and the retrieval of terrestrial vegetation bio-geophysical properties—A review. ISPRS J. Photogramm. Remote Sens. 2015, 108, 273–290. [Google Scholar] [CrossRef]
  11. Jin, X.; Zarco-Tejada, P.J.; Schmidhalter, U.; Reynolds, M.P.; Hawkesford, M.J.; Varshney, R.K.; Yang, T.; Nie, C.; Li, Z.; Ming, B. High-throughput estimation of crop traits: A review of ground and aerial phenotyping platforms. IEEE Geosci. Remote Sens. Mag. 2020, 9, 200–231. [Google Scholar] [CrossRef]
  12. Wang, F.; Yang, M.; Ma, L.; Zhang, T.; Qin, W.; Li, W.; Zhang, Y.; Sun, Z.; Wang, Z.; Li, F.; et al. Estimation of Above-Ground Biomass of Winter Wheat Based on Consumer-Grade Multi-Spectral UAV. Remote Sens. 2022, 14, 1251. [Google Scholar] [CrossRef]
  13. Jiang, J.; Zhang, Z.; Cao, Q.; Liang, Y.; Krienke, B.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Use of an Active Canopy Sensor Mounted on an Unmanned Aerial Vehicle to Monitor the Growth and Nitrogen Status of Winter Wheat. Remote Sens. 2020, 12, 3684. [Google Scholar] [CrossRef]
  14. Song, X.; Yang, G.; Xu, X.; Zhang, D.; Yang, C.; Feng, H. Winter Wheat Nitrogen Estimation Based on Ground-Level and UAV-Mounted Sensors. Sensors 2022, 22, 549. [Google Scholar] [CrossRef]
  15. Fu, Y.; Yang, G.; Li, Z.; Song, X.; Li, Z.; Xu, X.; Wang, P.; Zhao, C. Winter Wheat Nitrogen Status Estimation Using UAV-Based RGB Imagery and Gaussian Processes Regression. Remote Sens. 2020, 12, 3778. [Google Scholar] [CrossRef]
  16. Lu, N.; Zhou, J.; Han, Z.; Li, D.; Cao, Q.; Yao, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cheng, T. Improved estimation of aboveground biomass in wheat from RGB imagery and point cloud data acquired with a low-cost unmanned aerial vehicle system. Plant Methods 2019, 15, 17. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Li, Z.; Jin, X.; Yang, G.; Drummond, J.; Yang, H.; Clark, B.; Li, Z.; Zhao, C. Remote Sensing of Leaf and Canopy Nitrogen Status in Winter Wheat (Triticum aestivum L.) Based on N-PROSAIL Model. Remote Sens. 2018, 10, 1463. [Google Scholar] [CrossRef] [Green Version]
  18. Verrelst, J.; Malenovský, Z.; Van der Tol, C.; Camps-Valls, G.; Gastellu-Etchegorry, J.-P.; Lewis, P.; North, P.; Moreno, J. Quantifying Vegetation Biophysical Variables from Imaging Spectroscopy Data: A Review on Retrieval Methods. Surv. Geophys. 2019, 40, 589–629. [Google Scholar] [CrossRef] [Green Version]
  19. Qiu, Z.; Ma, F.; Li, Z.; Xu, X.; Ge, H.; Du, C. Estimation of nitrogen nutrition index in rice from UAV RGB images coupled with machine learning algorithms. Comput. Electron. Agric. 2021, 189, 106421. [Google Scholar] [CrossRef]
  20. Yue, J.; Yang, G.; Tian, Q.; Feng, H.; Xu, K.; Zhou, C. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices. ISPRS J. Photogramm. Remote Sens. 2019, 150, 226–244. [Google Scholar] [CrossRef]
  21. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef] [Green Version]
  22. Wang, S.; Azzari, G.; Lobell, D.B. Crop type mapping without field-level labels: Random Forest transfer and unsupervised clustering techniques. Remote Sens. Environ. 2019, 222, 303–317. [Google Scholar] [CrossRef]
  23. Yang, S.; Hu, L.; Wu, H.; Ren, H.; Qiao, H.; Li, P.; Fan, W. Integration of Crop Growth Model and Random Forest for Winter Wheat Yield Estimation from UAV Hyperspectral Imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 6253–6269. [Google Scholar] [CrossRef]
  24. Prado Osco, L.; Marques Ramos, A.P.; Roberto Pereira, D.; Akemi Saito Moriya, É.; Nobuhiro Imai, N.; Takashi Matsubara, E.; Estrabis, N.; de Souza, M.; Marcato Junior, J.; Gonçalves, W.N.; et al. Predicting Canopy Nitrogen Content in Citrus-Trees Using Random Forest Algorithm Associated to Spectral Vegetation Indices from UAV-Imagery. Remote Sens. 2019, 11, 2925. [Google Scholar] [CrossRef] [Green Version]
  25. Zha, H.; Miao, Y.; Wang, T.; Li, Y.; Zhang, J.; Sun, W.; Feng, Z.; Kusnierek, K. Improving Unmanned Aerial Vehicle Remote Sensing-Based Rice Nitrogen Nutrition Index Prediction with Machine Learning. Remote Sens. 2020, 12, 215. [Google Scholar] [CrossRef] [Green Version]
  26. Jacquemoud, S.; Verhoef, W.; Baret, F.; Bacour, C.; Zarco-Tejada, P.J.; Asner, G.P.; François, C.; Ustin, S.L. PROSPECT+SAIL models: A review of use for vegetation characterization. Remote Sens. Environ. 2009, 113, S56–S66. [Google Scholar] [CrossRef]
  27. Boogaard, H.; Van Diepen, C.; Rotter, R.; Cabrera, J.; Van Laar, H. WOFOST 7.1; user’s guide for the WOFOST 7.1 crop growth simulation model and WOFOST Control Center 1.5. Ann. Appl. Biol. 1998. [Google Scholar]
  28. Li, P.; Zhang, X.; Wang, W.; Zheng, H.; Yao, X.; Tian, Y.; Zhu, Y.; Cao, W.; Chen, Q.; Cheng, T. Estimating aboveground and organ biomass of plant canopies across the entire season of rice growth with terrestrial laser scanning. Int. J. Appl. Earth Obs. Geoinf. 2020, 91, 102132. [Google Scholar] [CrossRef]
  29. Guo, Y.; Fu, Y.; Hao, F.; Zhang, X.; Wu, W.; Jin, X.; Robin Bryant, C.; Senthilnath, J. Integrated phenology and climate in rice yields prediction using machine learning methods. Ecol. Indic. 2021, 120, 106935. [Google Scholar] [CrossRef]
  30. Li, Z.; Taylor, J.; Yang, H.; Casa, R.; Jin, X.; Li, Z.; Song, X.; Yang, G. A hierarchical interannual wheat yield and grain protein prediction model using spectral vegetative indices and meteorological data. Field Crops Res. 2020, 248, 107711. [Google Scholar] [CrossRef]
  31. Wang, L.; Chen, S.; Li, D.; Wang, C.; Jiang, H.; Zheng, Q.; Peng, Z. Estimation of Paddy Rice Nitrogen Content and Accumulation Both at Leaf and Plant Levels from UAV Hyperspectral Imagery. Remote Sens. 2021, 13, 2956. [Google Scholar] [CrossRef]
  32. Ge, H.; Xiang, H.; Ma, F.; Li, Z.; Qiu, Z.; Tan, Z.; Du, C. Estimating Plant Nitrogen Concentration of Rice through Fusing Vegetation Indices and Color Moments Derived from UAV-RGB Images. Remote Sens. 2021, 13, 1620. [Google Scholar] [CrossRef]
  33. Li, D.; Miao, Y.; Gupta, S.K.; Rosen, C.J.; Yuan, F.; Wang, C.; Wang, L.; Huang, Y. Improving Potato Yield Prediction by Combining Cultivar Information and UAV Remote Sensing Data Using Machine Learning. Remote Sens. 2021, 13, 3322. [Google Scholar] [CrossRef]
  34. Dong, R.; Miao, Y.; Wang, X.; Yuan, F.; Kusnierek, K. Canopy Fluorescence Sensing for In-Season Maize Nitrogen Status Diagnosis. Remote Sens. 2021, 13, 5141. [Google Scholar] [CrossRef]
  35. Zhang, X.; Han, L.; Sobeih, T.; Lappin, L.; Lee, M.A.; Howard, A.; Kisdi, A. The Self-Supervised Spectral and Spatial Vision Transformer Network for Accurate Prediction of Wheat Nitrogen Status from UAV Imagery. Remote Sens. 2022, 14, 1400. [Google Scholar] [CrossRef]
  36. Lemaire, G.; Jeuffroy, M.-H.; Gastal, F. Diagnosis tool for plant and crop N status in vegetative stage: Theory and practices for crop N management. Eur. J. Agron. 2008, 28, 614–624. [Google Scholar] [CrossRef]
  37. Zhao, Y.; Chen, P.; Li, Z.; Casa, R.; Feng, H.; Yang, G.; Yang, W.; Wang, J.; Xu, X. A modified critical nitrogen dilution curve for winter wheat to diagnose nitrogen status under different nitrogen and irrigation rates. Front. Plant Sci. 2020, 11, 549636. [Google Scholar] [CrossRef]
  38. Daughtry, C.S.; Walthall, C.; Kim, M.; De Colstoun, E.B.; McMurtrey Iii, J. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
  39. Verstraete, M.M.; Pinty, B.; Myneni, R.B. Potential and limitations of information extraction on the terrestrial biosphere from satellite remote sensing. Remote Sens. Environ. 1996, 58, 201–214. [Google Scholar] [CrossRef]
  40. Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845. [Google Scholar] [CrossRef]
  41. Datt, B. A New Reflectance Index for Remote Sensing of Chlorophyll Content in Higher Plants: Tests using Eucalyptus Leaves. J. Plant Physiol. 1999, 154, 30–36. [Google Scholar] [CrossRef]
  42. Peng, G.; Ruiliang, P.; Biging, G.S.; Larrieu, M.R. Estimation of forest leaf area index using vegetation indices derived from Hyperion hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1355–1362. [Google Scholar] [CrossRef] [Green Version]
  43. Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ. 1994, 48, 119–126. [Google Scholar] [CrossRef]
  44. Chen, J.M. Evaluation of Vegetation Indices and a Modified Simple Ratio for Boreal Applications. Can. J. Remote Sens. 1996, 22, 229–242. [Google Scholar] [CrossRef]
  45. Barnes, E.; Clarke, T.; Richards, S.; Colaizzi, P.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T. Coincident detection of crop water stress, nitrogen status and canopy density using ground based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000. [Google Scholar]
  46. Rouse, J.; Haas, R.; Schell, J.; Deering, D. Monitoring vegetation systems in the great plains with ERTS proceeding. In Proceedings of the Third Earth Reserves Technology Satellite Symposium, Greenbelt: NASA SP-351, Washington, DC, USA, 10–14 December 1973. [Google Scholar]
  47. Pearson, R.L.; Miller, L.D. Remote mapping of standing crop biomass for estimation of the productivity of the shortgrass prairie. Remote Sens. Environ. 1972, VIII, 1355. [Google Scholar]
  48. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  49. Zadoks, J.C.; Chang, T.T.; Konzak, C.F. A decimal code for the growth stages of cereals. Weed Res. 1974, 14, 415–421. [Google Scholar] [CrossRef]
  50. Feng, Q.; Liu, J.; Gong, J. UAV remote sensing for urban vegetation mapping using random forest and texture analysis. Remote Sens. 2015, 7, 1074–1094. [Google Scholar] [CrossRef] [Green Version]
  51. Liu, S.; Zeng, W.; Wu, L.; Lei, G.; Chen, H.; Gaiser, T.; Srivastava, A.K. Simulating the Leaf Area Index of Rice from Multispectral Images. Remote Sens. 2021, 13, 3663. [Google Scholar] [CrossRef]
  52. Kanke, Y.; Raun, W.; Solie, J.; Stone, M.; Taylor, R. Red edge as a potential index for detecting differences in plant nitrogen status in winter wheat. J. Plant Nutr. 2012, 35, 1526–1541. [Google Scholar] [CrossRef]
  53. Wang, Y.; Zhang, K.; Tang, C.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Estimation of rice growth parameters based on linear mixed-effect model using multispectral images from fixed-wing unmanned aerial vehicles. Remote Sens. 2019, 11, 1371. [Google Scholar] [CrossRef] [Green Version]
  54. Zhang, J.; Cheng, T.; Shi, L.; Wang, W.; Niu, Z.; Guo, W.; Ma, X. Combining spectral and texture features of UAV hyperspectral images for leaf nitrogen content monitoring in winter wheat. Int. J. Remote Sens. 2022, 43, 2335–2356. [Google Scholar] [CrossRef]
  55. Zhao, B.; Zhang, Y.; Duan, A.; Liu, Z.; Xiao, J.; Liu, Z.; Qin, A.; Ning, D.; Li, S.; Ata-Ul-Karim, S.T. Estimating the Growth Indices and Nitrogen Status Based on Color Digital Image Analysis During Early Growth Period of Winter Wheat. Front. Plant Sci. 2021, 12, 619522. [Google Scholar] [CrossRef]
  56. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of Winter Wheat Above-Ground Biomass Using Unmanned Aerial Vehicle-Based Snapshot Hyperspectral Sensor and Crop Height Improved Models. Remote Sens. 2017, 9, 708. [Google Scholar] [CrossRef] [Green Version]
  57. Fu, Z.; Yu, S.; Zhang, J.; Xi, H.; Gao, Y.; Lu, R.; Zheng, H.; Zhu, Y.; Cao, W.; Liu, X. Combining UAV multispectral imagery and ecological factors to estimate leaf nitrogen and grain protein content of wheat. Eur. J. Agron. 2022, 132, 126405. [Google Scholar] [CrossRef]
  58. Dong, L.; Du, H.; Han, N.; Li, X.; Zhu, D.E.; Mao, F.; Zhang, M.; Zheng, J.; Liu, H.; Huang, Z.; et al. Application of Convolutional Neural Network on Lei Bamboo Above-Ground-Biomass (AGB) Estimation Using Worldview-2. Remote Sens. 2020, 12, 958. [Google Scholar] [CrossRef] [Green Version]
  59. Yang, Q.; Shi, L.; Han, J.; Yu, J.; Huang, K. A near real-time deep learning approach for detecting rice phenology based on UAV images. Agric. For. Meteorol. 2020, 287, 107938. [Google Scholar] [CrossRef]
  60. Li, Z.; Zhao, Y.; Taylor, J.; Gaulton, R.; Jin, X.; Song, X.; Li, Z.; Meng, Y.; Chen, P.; Feng, H.; et al. Comparison and transferability of thermal, temporal and phenological-based in-season predictions of above-ground biomass in wheat crops from proximal crop reflectance data. Remote Sens. Environ. 2022, 273, 112967. [Google Scholar] [CrossRef]
  61. Yang, Q.; Shi, L.; Han, J.; Chen, Z.; Yu, J. A VI-based phenology adaptation approach for rice crop monitoring using UAV multispectral images. Field Crops Res. 2022, 277, 108419. [Google Scholar] [CrossRef]
  62. Yang, X.; Guo, R.; Knops, J.M.H.; Mei, L.; Kang, F.; Zhang, T.; Guo, J. Shifts in plant phenology induced by environmental changes are small relative to annual phenological variation. Agric. For. Meteorol. 2020, 294, 108144. [Google Scholar] [CrossRef]
  63. Large, E.C. Growth stages in cereals. Illustration of the Feekes scale. Plant Pathol. 1954, 3, 128–129. [Google Scholar] [CrossRef]
  64. Bian, C.; Shi, H.; Wu, S.; Zhang, K.; Wei, M.; Zhao, Y.; Sun, Y.; Zhuang, H.; Zhang, X.; Chen, S. Prediction of Field-Scale Wheat Yield Using Machine Learning Method and Multi-Spectral UAV Data. Remote Sens. 2022, 14, 1474. [Google Scholar] [CrossRef]
Figure 1. Location of Xiaotangshan National Experiment Station for Precision Agriculture and layout of experiment plots.
Figure 2. Average temperature and precipitation during growing season.
Figure 3. DJI P4M and sensor properties used in study.
Figure 4. Wheat growth data collected at five stages: (a) AGB, (b) PNA, (c) PNC, (d) NNI. The letters above the boxes are significance marks.
Figure 5. Linear relationships between R-band reflectance and crop parameters at all phenologies: (a) AGB, (b) PNA, (c) PNC, (d) NNI.
Figure 6. Linear relationships between NDVI and crop parameters at all phenologies: (a) AGB, (b) PNA, (c) PNC, (d) NNI.
Figure 7. RF modeling results for different combinations: (a) R2, (b) RMSE, (c) NRMSE.
Figure 8. R2 of model iteration in the best model: (a) AGB, (b) PNA, (c) PNC, (d) NNI; the gray area, red line, and blue line represent the SD, min, and max of the 10-fold cross-validation, respectively. The green dotted lines mark the models with four features, and the text along them lists the features of that model.
Figure 9. Relative importance of different variables in best models (%): (a) AGB, (b) PNA, (c) PNC, (d) NNI.
Figure 10. Model evaluation using all data. (a) AGB, (b) PNA, (c) PNC, (d) NNI. The 1:1 lines and fitting curves were plotted as black and green lines, respectively.
Figure 11. Spatial results of four crop parameters using best models at ZS65: (a) AGB, (b) PNA, (c) PNC, (d) NNI.
Figure 12. Model accuracy comparison at different stages: (a) AGB, (b) PNA, (c) PNC, (d) NNI.
Figure 13. Model accuracy comparison in different N treatments: (a) AGB, (b) PNA, (c) PNC, (d) NNI.
Table 1. List of used features in this study.
| Data Type | Features | Acronym | Equation | Reference |
| --- | --- | --- | --- | --- |
| Original Reflectance | Blue Band Reflectance | B | Reflectance of B band | / |
| | Green Band Reflectance | G | Reflectance of G band | / |
| | Red Band Reflectance | R | Reflectance of R band | / |
| | RedEdge Band Reflectance | RE | Reflectance of RE band | / |
| | Near-Infrared Band Reflectance | NIR | Reflectance of NIR band | / |
| Vegetation Indices | Difference Vegetation Index | DVI | NIR − R | [38] |
| | Enhanced Vegetation Index | EVI | 2.5 × (NIR − R) / (NIR + 6 × R − 7.5 × B + 1) | [39] |
| | Enhanced Vegetation Index 2 | EVI2 | 2.4 × (NIR − R) / (NIR + R + 1) | [40] |
| | Leaf Chlorophyll Index | LCI | (NIR − RE) / (NIR + R) | [41] |
| | Modified Chlorophyll Absorption Ratio Index | MCARI | ((RE − R) − 0.2 × (RE − G)) × (RE / R) | [38] |
| | Modified Non-Linear Index | MNLI | (1.5 × NIR² − 1.5 × G) / (NIR² + R + 0.5) | [42] |
| | Modified Soil-Adjusted Vegetation Index | MSAVI | (2 × NIR + 1 − √((2 × NIR + 1)² − 8 × (NIR − R))) / 2 | [43] |
| | Modified Simple Ratio Index | MSR | (NIR / R − 1) / (√(NIR / R) + 1) | [44] |
| | Normalized Difference Red-Edge | NDRE | (NIR − RE) / (NIR + RE) | [45] |
| | Normalized Difference Vegetation Index | NDVI | (NIR − R) / (NIR + R) | [46] |
| | Ratio Vegetation Index | RVI | NIR / R | [47] |
| | Soil-Adjusted Vegetation Index | SAVI | (NIR − R) / (NIR + R + 0.5) × (1 + 0.5) | [48] |
| Phenology Indicators | Phenology Indicators | PI | 33 (jointing stage), 47 (flag leaf stage), 65 (anthesis), 75 (early filling), 80 (late filling) | [49] |
Table 2. Combinations of features in RF models.
| Combination | Band | VI | PI |
| --- | --- | --- | --- |
| C1 | ✓ | | |
| C2 | | ✓ | |
| C3 | ✓ | ✓ | |
| C4 | ✓ | | ✓ |
| C5 | | ✓ | ✓ |
| C6 | ✓ | ✓ | ✓ |
Table 3. Coefficient of determination (R2) between biometrics and reflectance or vegetation indices.
| | Feature | AGB | PNA | PNC | NNI |
| --- | --- | --- | --- | --- | --- |
| Original Reflectance | Blue | 0.21 ** | 0.17 ** | 0.00 ns | 0.13 ** |
| | Green | 0.22 ** | 0.25 ** | 0.08 ** | 0.26 ** |
| | Red | 0.25 ** | 0.35 ** | 0.19 ** | 0.42 ** |
| | RE | 0.11 ** | 0.07 ** | 0.00 ns | 0.03 * |
| | NIR | 0.03 * | 0.14 ** | 0.21 ** | 0.24 ** |
| Vegetation Indices | DVI | 0.16 ** | 0.35 ** | 0.35 ** | 0.52 ** |
| | EVI | 0.10 ** | 0.29 ** | 0.39 ** | 0.48 ** |
| | EVI2 | 0.38 ** | 0.50 ** | 0.06 ** | 0.52 ** |
| | LCI | 0.28 ** | 0.33 ** | 0.08 ** | 0.34 ** |
| | MCARI | 0.21 ** | 0.37 ** | 0.29 ** | 0.49 ** |
| | MNLI | 0.22 ** | 0.43 ** | 0.33 ** | 0.59 ** |
| | MSAVI | 0.23 ** | 0.45 ** | 0.36 ** | 0.61 ** |
| | MSR | 0.31 ** | 0.52 ** | 0.29 ** | 0.63 ** |
| | NDRE | 0.64 ** | 0.58 ** | 0.01 ns | 0.43 ** |
| | NDVI | 0.30 ** | 0.49 ** | 0.32 ** | 0.61 ** |
| | RVI | 0.30 ** | 0.50 ** | 0.26 ** | 0.60 ** |
| | SAVI | 0.23 ** | 0.45 ** | 0.36 ** | 0.61 ** |
Significance level: ns = not significant, * p < 0.05, ** p < 0.01.
Table 4. A three-way ANOVA for effects of wheat variety (V), nitrogen fertilizer (N), and phenology (P) on AGB, PNA, PNC, and NNI.
| Factor | AGB p Value | AGB Contribution | PNA p Value | PNA Contribution | PNC p Value | PNC Contribution | NNI p Value | NNI Contribution |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| V | 0.941 | 0.00% | 0.654 | 0.15% | 0.114 | 1.22% | 0.428 | 0.59% |
| N | 0.000 | 36.81% | 0.000 | 51.53% | 0.000 | 41.54% | 0.000 | 72.64% |
| P | 0.000 | 49.87% | 0.000 | 35.02% | 0.000 | 40.22% | 0.001 | 16.04% |
| V × N | 0.819 | 0.64% | 0.616 | 1.35% | 0.237 | 2.04% | 0.593 | 1.75% |
| V × P | 0.990 | 0.21% | 0.975 | 0.36% | 0.963 | 0.29% | 0.946 | 0.69% |
| N × P | 0.123 | 10.96% | 0.287 | 9.78% | 0.004 | 12.22% | 0.873 | 5.88% |
| V × N × P | 0.999 | 1.52% | 0.998 | 1.80% | 0.947 | 2.45% | 0.997 | 2.42% |
Table 5. PLSR results for different combinations.
| | R2 | | | | RMSE | | | | NRMSE (%) | | | |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| | AGB | PNA | PNC | NNI | AGB | PNA | PNC | NNI | AGB | PNA | PNC | NNI |
| C1 | 0.62 | 0.67 | 0.42 | 0.68 | 2.25 | 33.96 | 0.33 | 0.16 | 14.65 | 14.13 | 16.44 | 15.15 |
| C2 | 0.81 | 0.78 | 0.76 | 0.74 | 1.57 | 27.90 | 0.21 | 0.14 | 10.24 | 11.61 | 10.38 | 13.61 |
| C3 | 0.81 | 0.78 | 0.78 | 0.75 | 1.58 | 27.53 | 0.20 | 0.14 | 10.31 | 11.45 | 10.00 | 13.28 |
| C4 | 0.74 | 0.69 | 0.67 | 0.67 | 1.86 | 33.02 | 0.25 | 0.16 | 12.15 | 15.93 | 12.36 | 15.34 |
| C5 | 0.82 | 0.78 | 0.74 | 0.74 | 1.56 | 27.75 | 0.21 | 0.14 | 10.15 | 11.55 | 10.83 | 13.57 |
| C6 | 0.82 | 0.79 | 0.75 | 0.76 | 1.56 | 27.06 | 0.21 | 0.14 | 10.16 | 11.26 | 10.62 | 13.18 |
Table 6. SVR results for different combinations.
| | R2 | | | | RMSE | | | | NRMSE (%) | | | |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| | AGB | PNA | PNC | NNI | AGB | PNA | PNC | NNI | AGB | PNA | PNC | NNI |
| C1 | 0.45 | 0.46 | 0.39 | 0.68 | 2.79 | 45.52 | 0.34 | 0.16 | 18.19 | 18.94 | 17.04 | 15.28 |
| C2 | 0.81 | 0.74 | 0.76 | 0.74 | 1.58 | 30.48 | 0.21 | 0.14 | 10.31 | 12.68 | 10.58 | 13.71 |
| C3 | 0.81 | 0.74 | 0.77 | 0.76 | 1.59 | 30.26 | 0.20 | 0.14 | 10.40 | 12.59 | 10.20 | 13.25 |
| C4 | 0.71 | 0.63 | 0.66 | 0.69 | 2.00 | 38.27 | 0.25 | 0.16 | 13.02 | 13.74 | 12.53 | 15.13 |
| C5 | 0.81 | 0.74 | 0.75 | 0.74 | 1.57 | 30.12 | 0.21 | 0.14 | 10.26 | 12.53 | 10.75 | 13.73 |
| C6 | 0.81 | 0.75 | 0.77 | 0.77 | 1.57 | 29.68 | 0.20 | 0.14 | 10.22 | 12.35 | 10.28 | 12.87 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Han, S.; Zhao, Y.; Cheng, J.; Zhao, F.; Yang, H.; Feng, H.; Li, Z.; Ma, X.; Zhao, C.; Yang, G. Monitoring Key Wheat Growth Variables by Integrating Phenology and UAV Multispectral Imagery Data into Random Forest Model. Remote Sens. 2022, 14, 3723. https://doi.org/10.3390/rs14153723

