Article

Wheat Growth Monitoring and Yield Estimation based on Multi-Rotor Unmanned Aerial Vehicle

1 National Engineering and Technology Center for Information Agriculture, Key Laboratory for Crop System Analysis and Decision Making, Ministry of Agriculture, Jiangsu Key Laboratory for Information Agriculture, Jiangsu Collaborative Innovation Center for Modern Crop Production, Nanjing Agricultural University, 1 Weigang Road, Nanjing 210095, China
2 Department of Agronomy and Horticulture, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
* Author to whom correspondence should be addressed.
Submission received: 6 January 2020 / Revised: 30 January 2020 / Accepted: 1 February 2020 / Published: 5 February 2020
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)

Abstract

Leaf area index (LAI) and leaf dry matter (LDM) are important indices of crop growth. Real-time, nondestructive monitoring of crop growth is instructive for the diagnosis of crop growth and prediction of grain yield. Unmanned aerial vehicle (UAV)-based remote sensing is widely used in precision agriculture due to its unique advantages in flexibility and resolution. This study was carried out on wheat trials treated with different nitrogen levels and seeding densities in three regions of Jiangsu Province in 2018–2019. Canopy spectral images were collected by the UAV equipped with a multi-spectral camera during key wheat growth stages. To verify the results of the UAV images, the LAI, LDM, and yield data were obtained by destructive sampling. We extracted the wheat canopy reflectance and selected the best vegetation indices for monitoring growth and predicting yield. Simple linear regression (LR), multiple linear regression (MLR), stepwise multiple linear regression (SMLR), partial least squares regression (PLSR), artificial neural network (ANN), and random forest (RF) modeling methods were used to construct models for wheat yield estimation. The results show that a multi-spectral camera mounted on a multi-rotor UAV has broad application prospects in crop growth index monitoring and yield estimation. Vegetation indices combining the red edge band and the near-infrared band were significantly correlated with LAI and LDM. Machine learning methods (i.e., PLSR, ANN, and RF) performed better for predicting wheat yield. The RF model constructed with the normalized difference vegetation index (NDVI) at the jointing stage, booting stage, flowering stage, and filling stage was the optimal wheat yield estimation model in this study, with an R2 of 0.78 and relative root mean square error (RRMSE) of 0.1030. The results provide a theoretical basis for monitoring crop growth with a multi-rotor UAV platform and explore a technical method for improving the precision of yield estimation.


1. Introduction

Remote sensing platforms for crop growth monitoring and yield estimation mainly include ground, low-altitude (unmanned aerial vehicle, UAV), and high-altitude (satellite, aerospace) platforms. Ground remote sensing, such as analytical spectral devices (ASD), is easy to use and offers many bands and high resolution, but it is time-consuming, labor-intensive, and inefficient for monitoring over a wide area. High-altitude remote sensing, such as the MODIS and Landsat satellites, is suitable for large-scale monitoring, but has low resolution, a long revisit period, and is susceptible to weather [1]. UAVs have developed rapidly since the end of the 20th century, and with the improvement of agricultural remote sensing, UAV remote sensing has quickly been applied in practice [2]. Field information acquired from UAV platforms mainly includes vegetation coverage, crop growth status, and yield estimates. UAVs used to acquire field information are divided into fixed-wing UAVs and multi-rotor UAVs. Multi-rotor UAVs can adjust their flight altitude as needed, making them convenient and flexible to fly while improving efficiency and accuracy. To some extent, UAVs have overcome the deficiencies of ground and high-altitude remote sensing, providing strong support for crop information monitoring in precision agriculture [3].
A variety of sensors are widely used for canopy-scale spectral monitoring, either handheld or airborne. The portable multi-spectral growth monitoring and diagnosis instrument CGMD302, developed by the National Information Agriculture Engineering Technology Center, can quickly measure canopy spectral reflectance at 720 and 810 nm and has already been applied to monitor the growth of rice, wheat, and other crops [4]. GreenSeeker, a handheld active spectrometer developed by Trimble Navigation Limited, has two fixed bands (red and near-infrared), from which the normalized difference vegetation index (NDVI) and ratio vegetation index (RVI) can be constructed; it has been applied to growth monitoring of rice, wheat, corn, and other crops [5,6]. These handheld sensors have few wavebands and can construct only a limited set of spectral indices, which makes it difficult to cope with vegetation index saturation in the late growth period of crops. With the development of UAV technology, an increasing number of sensors have been mounted on UAVs and applied in UAV remote sensing. For example, Yang et al. [7] used rotor UAVs equipped with a digital camera, a multi-spectral camera, and a thermal imager to assist wheat breeding, including the acquisition of plot-level crop lodging area, leaf area index (LAI), and canopy temperature. Tian et al. [8] used an imaging spectral sensor mounted on an eight-rotor UAV to obtain hyperspectral image data of farmland for inversion of cotton LAI. Zhao et al. [9] used an eight-rotor UAV with an imaging hyperspectral sensor to obtain images of key growth periods of soybean and accurately estimate soybean yield. Many other studies have been conducted on crop growth monitoring and yield estimation based on UAV remote sensing [10,11,12]. However, due to the limitations of UAV platforms and sensor technology, the pre-processing of remote sensing images is not yet straightforward, leaving room for improving the accuracy of monitoring and estimation models.
LAI and leaf dry matter (LDM) are important physiological parameters for describing crop growth, as well as important indices of crop growth status [13]. Many studies have monitored growth indicators such as LAI with spectral data from UAV platforms. For example, Córcoles et al. [14] analyzed the correlation between LAI and canopy coverage with three models based on a quadrotor UAV equipped with a digital camera. Gao et al. [15] used a multi-rotor UAV with an imaging spectral sensor to obtain spectral data of wheat at various growth stages and analyzed its correlation with LAI. Aasen et al. [16] used a UAV carrying hyperspectral cameras to test different varieties and demonstrated the feasibility of LAI monitoring in the context of precision agriculture.
Rapid and non-destructive estimation of crop yield is an important part of agricultural remote sensing. Remote sensing technology has been widely used in crop growth monitoring and yield estimation, and the development of UAV remote sensing provides a new means for this task [17,18]. Zhu et al. [19] used a UAV remote sensing platform equipped with a multi-spectral camera to obtain image data of wheat at the jointing, heading, and filling stages, and constructed nine linear models relating different vegetation indices to measured yields using the least squares method. Gong et al. [20] used a multi-spectral camera mounted on a multi-rotor UAV to obtain images of rapeseed at early flowering and used a normalized vegetation index to predict yield. Yu et al. [21] developed a dual-camera high-throughput phenotyping (HTP) platform on a UAV that collected multispectral data from multiple growth periods to improve the accuracy of soybean yield estimates. At present, the workflow for estimating crop yield from canopy spectral images acquired by UAV platforms equipped with multiple sensors has been preliminarily established, so more studies now focus on the selection of vegetation indices and the improvement of yield estimation accuracy.
In this study, a multi-rotor UAV platform equipped with a multi-spectral camera was used to obtain canopy spectral data of wheat in multiple growth periods, and a variety of parametric or non-parametric modeling methods were integrated to monitor the main growth indicators (LAI and LDM) and predict grain yield. Therefore, the objectives of this study were: (1) to evaluate the potential of multi-spectral information obtained from a multi-rotor UAV for crop monitoring; (2) to invert the LAI and LDM of a crop and predict the yield by integrating spectral information with agronomic parameters; and (3) to compare and evaluate the estimation accuracy of various parametric regression methods and non-parametric regression methods. The results from this study will provide a reference for further research on UAV monitoring applied to wheat.

2. Materials and Methods

2.1. Experimental Design

In this study, field experiments were carried out over two consecutive years (2018–2019) at experimental stations located in Suining (117°54′E, 33°57′N), Xinghua (119°53′E, 33°05′N), and Kunshan (120°53′E, 31°29′N) (Figure 1), Jiangsu Province, China. The experimental area lies in the plain of the Yangtze River reaches, with an average altitude of less than 50 m. Winter wheat is grown with one harvest per year within a rice–wheat rotation system. Yangmai-23 was sown in Xinghua, Yangmai-15 in Kunshan, and Xumai-33 in Suining; these varieties are suited to the local growing environments. The treatments involved different N rates and seeding densities, with details provided in Table 1. N fertilizer (urea, 46% N) was applied in two equal splits of 50% of the total N, before sowing and at the stem elongation stage.

2.2. Field Data Acquisition

All field data acquisition was synchronized with sampling dates (Table 1).
To measure the LAI, 30 individual wheat plants were selected from each plot and separated into organs (stem, leaf, and spike). Leaf area was measured using a portable LI-3000C leaf area meter (LI-COR, Lincoln, NE, USA), and the LAI of the population was calculated from the per-plant leaf area and the number of plants and tillers per square meter.
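To make the scaling from sampled plants to a population LAI concrete, the short Python sketch below shows the arithmetic implied by this procedure; the leaf area and stem density values are hypothetical and the code is illustrative rather than the authors' own.

```python
# Illustrative only: scaling per-plant green leaf area (measured with the
# LI-3000C on the 30 sampled plants) to a population LAI using the counted
# number of plants and tillers per square meter. All numbers are hypothetical.
mean_leaf_area_per_plant_cm2 = 85.0   # mean green leaf area per plant (cm^2)
stems_per_m2 = 320.0                  # plants plus tillers per square meter

# LAI = one-sided green leaf area per unit ground area (dimensionless);
# 10,000 cm^2 of leaf area equals 1 m^2 of ground area.
lai = mean_leaf_area_per_plant_cm2 * stems_per_m2 / 10_000.0
print(f"LAI = {lai:.2f}")   # -> LAI = 2.72
```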
To measure the LDM, 30 individual wheat plants were selected from each plot and separated into organs (stem, leaf, and spike). The green leaves were first placed in an oven at 105 °C for 30 min and then dried at 80 °C for more than 48 h until a constant weight was reached. Finally, the leaf dry matter was weighed.
To measure the grain yield, at the maturity stage, plants from a 1 m² area were taken from the unsampled part of each plot to determine the number of panicles per unit land area, and 30 plants were taken for indoor seed testing to determine the number of grains per spike, thousand-grain weight, and seed setting percentage. In each plot, plants in an area of 1 m² were harvested twice (two replicates) for threshing and yield measurement.

2.3. Acquisition of UAV Images

This experiment used a six-rotor UAV (DJI M600 Pro, DJI, Shenzhen, China) equipped with a multi-spectral camera (Airphen, Hiphen, France) to acquire image data at an altitude of 50 m above the wheat canopy (spatial resolution 4.7 cm, focal length 8 mm, forward overlap 85%, side overlap 90%, flight speed 2 m/s). The acquisition of UAV images was synchronized with the field sampling dates (Table 1). The software DJI GS PRO (https://www.dji.com/cn/ground-station-pro/) was used to pre-plan the route and monitor the UAV's flight. The multi-spectral camera has six channels with a resolution of 1280 × 960 pixels and center wavelengths of 450, 530, 570, 675, 730, and 850 nm. Radiometric calibration images of standard reflectance panels on the ground were taken before each flight. The camera was set to take photos automatically at one-second intervals, and the images were saved in TIFF format. Flights were conducted in clear, cloudless, and calm weather between 11 a.m. and 1 p.m.

2.4. Processing of UAV Images

When a UAV carries multi-spectral sensors for crop monitoring, it is vulnerable to adverse weather factors, such as cloud cover, wind, and rain, which reduce image quality and increase the difficulty of later data analysis. Therefore, preprocessing of UAV images is essential before formal processing and analysis. Common UAV image preprocessing steps include radiometric correction, image mosaicking, geometric correction, and image cropping.
Radiometric calibration was performed with calibration boards: image digital numbers were converted to reflectance using the reflectance of ground targets measured in real time [22]. The calibration board was obtained from Hiphen (http://www.hiphen-plant.com/) and has a known reflectance in each band. The camera's plug-in for Agisoft PhotoScan version 1.4.5 (https://www.agisoft.com/) was used to complete the radiometric calibration of the multi-spectral images; this step also includes vignetting correction, which effectively removes the vignetting caused by strong illumination. Because the calibration plug-in runs within Agisoft PhotoScan, image stitching was also carried out in this software. First, the folder of aerial images was imported into Agisoft PhotoScan, and redundant images captured during take-off and landing were deleted to reduce the processing load. The images were then aligned, dense point clouds were built, and mesh and texture were generated. Finally, an orthorectified high-resolution canopy image of the experimental field was obtained. To ensure geometric accuracy, geometric correction based on ground control points (GCPs) was adopted. Customized standard plates were placed around the plots as GCPs (Figure 2), and the high-precision position of each GCP center was measured with real-time kinematic (RTK) GPS. The number of GCPs depended on the field size and shape: 18 GCPs were arranged in Xinghua and Kunshan, and 16 in Suining. After the above preprocessing, a multi-spectral image of the whole canopy of the experimental field was obtained, with a final orthomosaic precision of 0.3 cm. The crop tool in ENVI software was used to retain only the test area in the image, eliminating the influence of roads and other irrelevant features around the field. The region of interest (ROI) of each plot was manually delineated (Figure 2), and the reflectance data were extracted from the canopy spectral image.
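The ROI extraction step can also be scripted. The sketch below is not the workflow used in the paper (which relied on ENVI and manually delineated ROIs); it is a minimal Python alternative assuming the pre-processed reflectance orthomosaic is a georeferenced, float-valued GeoTIFF and the plot ROIs are stored as polygons in a GeoJSON file with a `plot_id` property (the file names and the property name are hypothetical).

```python
import json
import numpy as np
import rasterio
from rasterio.mask import mask

BANDS = ["450", "530", "570", "675", "730", "850"]  # Airphen band centers (nm)

# Hypothetical inputs: a 6-band float reflectance orthomosaic and plot polygons
# in the same coordinate reference system.
with open("plot_rois.geojson") as f:
    rois = json.load(f)["features"]

plot_reflectance = {}
with rasterio.open("orthomosaic_reflectance.tif") as src:
    for roi in rois:
        plot_id = roi["properties"]["plot_id"]
        # Clip the orthomosaic to the ROI polygon; pixels outside become NaN
        clipped, _ = mask(src, [roi["geometry"]], crop=True, nodata=np.nan)
        # Mean reflectance of each band over all valid pixels in the plot
        band_means = np.nanmean(clipped.reshape(clipped.shape[0], -1), axis=1)
        plot_reflectance[plot_id] = dict(zip(BANDS, band_means))

print(plot_reflectance)
```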

2.5. Selection of Vegetation Index

After each flight, six multi-spectral image sets at wavelengths of 450, 530, 570, 675, 730, and 850 nm were generated. For each sampling period, a pre-processed reflectance image was produced for each band. The reflectance of each plot was obtained by averaging all pixels within the plot ROI, and the commonly used vegetation indices described in Table 2 were then calculated from these band reflectances.
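As an illustration of this step, the following sketch computes the Table 2 indices from per-plot band reflectances. The mapping of the green band to the 570 nm channel follows Table 4, and the soil-adjustment factor L = 0.5 for SAVI is an assumed value since the text does not state it; CCCI additionally requires the NDRE minimum and maximum over the whole dataset.

```python
import numpy as np

def vegetation_indices(g570, r675, re730, nir850, L=0.5):
    """Vegetation indices of Table 2 from band reflectances (scalars or arrays).
    The soil-adjustment factor L = 0.5 for SAVI is an assumption."""
    return {
        "GNDVI":  (nir850 - g570) / (nir850 + g570),
        "NDVI":   (nir850 - r675) / (nir850 + r675),
        "NDRE":   (nir850 - re730) / (nir850 + re730),
        "RVI":    nir850 / r675,
        "CIRE":   nir850 / re730 - 1.0,
        "OSAVI":  (nir850 - r675) / (nir850 + r675 + 0.16),
        "SAVI":   (1 + L) * (nir850 - r675) / (nir850 + r675 + L),
        "RESAVI": 1.5 * (nir850 - re730) / (nir850 + re730 + 0.5),
    }

# Example with hypothetical plot-mean reflectances
vi = vegetation_indices(g570=0.08, r675=0.05, re730=0.20, nir850=0.45)

# CCCI normalizes NDRE against its minimum and maximum over the whole dataset
ndre_all = np.array([0.25, 0.31, 0.38, 0.45])   # hypothetical NDRE values
ccci = (ndre_all - ndre_all.min()) / (ndre_all.max() - ndre_all.min())
```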

2.6. Modeling Methods and Validation

Quantitative remote sensing links remotely sensed information with agricultural target parameters through modeling, which can be divided into three categories: physical models, statistical models, and semi-empirical models [32]. A statistical model makes empirical statistical descriptions or conducts correlation analyses of a series of observations and establishes a regression model between remote sensing parameters and agricultural measurements. In addition to the commonly used parametric regression methods, various non-parametric regression methods have become popular, including stepwise multiple regression (SMR), partial least squares regression (PLSR), artificial neural networks (ANN), and support vector machines (SVM).
In this paper, multivariate modeling was used to explore wheat yield estimation. To improve estimation accuracy, we assembled modeling datasets from four perspectives and applied six modeling methods: linear regression (LR), multiple linear regression (MLR), stepwise multiple linear regression (SMLR), partial least-squares regression (PLSR), artificial neural network (ANN), and random forest (RF). These methods were compared to identify the optimal wheat yield estimation model.
In this study, LR was used to construct the yield estimation model of vegetation indices and test the estimation accuracy. This was calculated as follows:
$Y = kX + b$
where $Y$ is the yield or predicted value (dependent variable), $X$ is the vegetation index or measured value (independent variable), $k$ is the slope, and $b$ is the intercept.
MLR is a regression analysis with two or more independent variables. Crop growth is often related to many factors, so MLR may have more practical relevance than LR. Based on the sample data, this study built an MLR estimation model, calculated as follows:
$Y = k_1 X_1 + k_2 X_2 + k_3 X_3 + \cdots + k_n X_n + b$
where $Y$ is yield, $X_1, \ldots, X_n$ are vegetation indices, $k_1, \ldots, k_n$ are the coefficients of the corresponding independent variables, and $b$ is the intercept.
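As a hedged sketch of how such LR/MLR models can be fitted (the paper does not state which software was used), scikit-learn's `LinearRegression` estimates the coefficients $k_1, \ldots, k_n$ and the intercept $b$ by ordinary least squares; the arrays below are placeholders.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Placeholder data: each row is a sample, each column a vegetation index;
# y is the measured grain yield (t/ha).
X = np.array([[0.82, 0.41],
              [0.75, 0.35],
              [0.88, 0.47],
              [0.69, 0.30]])
y = np.array([6.8, 5.9, 7.4, 5.1])

mlr = LinearRegression().fit(X, y)   # estimates k_1..k_n and intercept b
print("coefficients:", mlr.coef_, "intercept:", mlr.intercept_)
print("predicted yield:", mlr.predict(X))
```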
In MLR, multicollinearity among variables affects parameter estimation and reduces estimation accuracy. Therefore, this study also applied SMLR and PLSR to mitigate collinearity [33]. Typically, SMLR eliminates unnecessary factors by stepwise regression, selects significant factors, and yields the optimal regression model [34]. First, the variable with the maximum regression sum of squares is selected from the candidate variables; then a variable is selected from the remaining candidates to form a binary regression equation with the already selected variable. This cycle is repeated so that only important variables are retained in the regression equation, which reduces the effect of collinearity.
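The exact selection criteria used for SMLR are not specified in the text; the sketch below shows one common way to realise forward stepwise selection with statsmodels, adding at each step the candidate whose entry p-value is smallest, as long as it falls below a threshold. It is an illustrative stand-in, not the authors' implementation.

```python
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(X: pd.DataFrame, y, alpha_in=0.05):
    """Forward stepwise selection: greedily add the predictor with the
    smallest p-value until no remaining candidate is significant."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
            pvals[cand] = model.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] < alpha_in:
            selected.append(best)    # keep the significant factor
            remaining.remove(best)
        else:
            break                    # no remaining candidate is significant
    final = sm.OLS(y, sm.add_constant(X[selected])).fit() if selected else None
    return final, selected
```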
PLSR combines the basic functions of MLR, canonical correlation analysis, and principal component analysis. It can cope with non-normally distributed data, eliminate multicollinearity among independent variables, and preserve the relationship between independent and dependent variables [35,36].
ANN simulates a biological neural network [37] and consists of an input layer, hidden layers, and an output layer [38]. Each layer can be regarded as a set of logistic regression models that receive the output of the previous layer and pass their estimates to the next layer. Activation functions are applied between the hidden layer and the output layer, mainly the S-shaped (sigmoid) function and the double-S function; the latter was adopted in this study [39]. ANN has strong fault tolerance and adaptive learning ability, is well suited to modeling complex, non-linear systems, and can be used for large-scale information processing.
RF is an ensemble learning method that integrates multiple decision trees; it efficiently processes large amounts of data, fits well, and is robust to noise [38,40]. RF uses bootstrap resampling to draw samples from the original dataset of size N, sampling N times to form each training set, and uses the unsampled (out-of-bag) samples for estimation. The final estimate is obtained by aggregating the estimates of the individual decision trees [41]. The main parameters are NTREE, the number of decision trees in the forest, and MTREE, the number of variables considered at each node split [38]. After tuning, NTREE was set to 1000 and MTREE to two in this study.
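A minimal sketch of fitting and comparing PLSR, ANN, and RF with scikit-learn is given below; the RF settings mirror the NTREE = 1000 and MTREE = 2 values from the text (mapped to `n_estimators` and `max_features`), while the PLS component count, the ANN architecture, and the synthetic data are assumptions introduced only to make the example runnable.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor

# Synthetic placeholder data: 4 vegetation-index features, 72 modeling samples
# and 30 validation samples (matching the split described in Section 3.4).
rng = np.random.default_rng(0)
X_train, X_val = rng.uniform(0.2, 0.9, (72, 4)), rng.uniform(0.2, 0.9, (30, 4))
y_train = 8 * X_train.mean(axis=1) + rng.normal(0, 0.3, 72)
y_val = 8 * X_val.mean(axis=1) + rng.normal(0, 0.3, 30)

models = {
    "PLSR": PLSRegression(n_components=2),               # component count assumed
    "ANN": MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                        max_iter=5000, random_state=0),  # architecture assumed
    "RF": RandomForestRegressor(n_estimators=1000,       # NTREE = 1000
                                max_features=2,          # MTREE = 2
                                random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_train, y_train).predict(X_val).ravel()
    r2 = np.corrcoef(y_val, pred)[0, 1] ** 2                     # squared correlation
    rrmse = np.sqrt(np.mean((pred - y_val) ** 2)) / y_val.mean()
    print(f"{name}: R2 = {r2:.3f}, RRMSE = {rrmse:.4f}")
```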
Based on the above experimental design, data acquisition and analysis, a flowchart (Figure 3) was made to better explain the research procedure.
We used the standard deviation (SD) and coefficient of variation (C.V.) to measure the dispersion of the data; the greater the C.V., the more variable the dataset. SD and C.V. were calculated as follows:
$\mathrm{SD} = \sqrt{\dfrac{1}{k-1}\sum_{i=1}^{k}\left(x_i - \bar{x}\right)^2}$
$\mathrm{C.V.} = \dfrac{\mathrm{SD}}{\mathrm{Mean}}$
where $x_i$ is the $i$-th sample, $\bar{x}$ is the mean of the total samples, and $k$ is the number of samples.
UAV image data and agronomic parameters obtained from the experiment were used to establish the wheat growth index and yield estimation models. Two indices, the coefficient of determination (R2) and the relative root mean square error (RRMSE), were chosen to evaluate model performance. R2 and RRMSE were calculated as follows:
$R^2 = \dfrac{\left[\sum_{i=1}^{k}\left(m_i - \bar{m}\right)\left(n_i - \bar{n}\right)\right]^2}{\sum_{i=1}^{k}\left(m_i - \bar{m}\right)^2 \sum_{i=1}^{k}\left(n_i - \bar{n}\right)^2}$
$\mathrm{RRMSE} = \sqrt{\dfrac{1}{k}\sum_{i=1}^{k}\left(m_i - n_i\right)^2} \Big/ \bar{n}$
where $m_i$ and $n_i$ are the estimated and measured values, respectively, $\bar{m}$ and $\bar{n}$ are the mean estimated and measured values, respectively, and $k$ is the number of samples. The models were tested with an independent validation dataset.
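For concreteness, the evaluation statistics above can be implemented directly from their definitions; the small helpers below are a sketch of that computation (not the authors' code).

```python
import numpy as np

def sd(x):
    """Sample standard deviation as defined above."""
    x = np.asarray(x, dtype=float)
    return np.sqrt(np.sum((x - x.mean()) ** 2) / (len(x) - 1))

def cv(x):
    """Coefficient of variation: SD divided by the mean."""
    return sd(x) / np.mean(x)

def r2(m, n):
    """Squared correlation between estimated (m) and measured (n) values."""
    m, n = np.asarray(m, dtype=float), np.asarray(n, dtype=float)
    num = np.sum((m - m.mean()) * (n - n.mean())) ** 2
    den = np.sum((m - m.mean()) ** 2) * np.sum((n - n.mean()) ** 2)
    return num / den

def rrmse(m, n):
    """RMSE of estimates m against measurements n, relative to the measured mean."""
    m, n = np.asarray(m, dtype=float), np.asarray(n, dtype=float)
    return np.sqrt(np.mean((m - n) ** 2)) / n.mean()
```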

3. Results

3.1. Variations of Wheat Growth Indices and Yield

The analysis was based on the pooled dataset; the different treatments (nitrogen levels and seeding densities) were not modeled separately but served to broaden the range of conditions in the modeling data, thus improving the generality of the models. Because of the differences among nitrogen levels and seeding densities, substantial variation was observed in the data (Table 3). LDM in the modeling dataset had the largest variation, with a C.V. of 42.42%. LAI and yield had the next greatest variation, with coefficients of variation of 37.81% and 30.10%, respectively. The validation dataset was similar to the modeling dataset, with the largest variation in LDM, followed by LAI and yield; the C.V. for LDM, LAI, and yield was 45.32%, 40.31%, and 21.72%, respectively. The ranges of LAI (0.6765–5.3046), LDM (2.2429–23.4500 t/ha), and yield (1.3000–8.5102 t/ha) cover most of the situations likely to be encountered. Therefore, the dataset can support the development of reliable wheat growth monitoring and yield estimation models.

3.2. LAI Estimation based on UAV Images

The regression relationships between the vegetation indices calculated from the multi-spectral data and LAI are shown in Figure 4. Different vegetation indices had different monitoring accuracy (R2 = 0.62–0.74). The best vegetation index was RESAVI, which combines the red edge band and the near-infrared band, with an R2 of 0.74.
According to the model validation results (Figure 5), the accuracy of the different vegetation indices was acceptable (R2 = 0.60–0.76). RESAVI had the best monitoring and estimation accuracy, with an R2 of 0.76 and RRMSE of 0.1990. NDRE, CIRE, and CCCI also performed well, with validation R2 values of 0.74 (RRMSE = 0.2096–0.2126), indicating good estimation accuracy.
Overall, models with a better monitoring effect also had better estimation accuracy (e.g., NDRE), while models with a poor monitoring effect also had low validation accuracy (e.g., SAVI). The RESAVI model was the optimal LAI estimation model, explaining 74% of the variability in modeling, with a validation R2 of 0.76 and RRMSE of 0.1990.

3.3. LDM Estimation based on UAV Images

The regression relationships between the vegetation indices calculated from the multi-spectral data and LDM are shown in Figure 6. The accuracy of the LDM models constructed with different vegetation indices varied (R2 = 0.46–0.75). The vegetation indices combining the red edge band and near-infrared band (NDRE, CIRE, CCCI, and RESAVI) performed better (R2 ≥ 0.70), whereas SAVI performed poorly, explaining only 46% of the variation. These results are similar to those for LAI. The model constructed with CIRE was optimal, explaining 75% of the variability.
Validation of the models built with each vegetation index gave R2 values ranging from 0.43 to 0.74 (Figure 7). Models with a higher validation R2 also had a lower RRMSE, indicating better estimation accuracy. The models constructed with NDRE, CIRE, and CCCI fitted the data well, with R2 values of 0.74, 0.73, and 0.74, respectively. As above, SAVI estimated the data poorly (R2 = 0.43). NDRE gave the optimal estimation model, explaining 74% of the variability, with an RRMSE of 0.2337.
Combining the modeling and validation results, the models with better estimation performance were all constructed with vegetation indices that combine the red edge band and the near-infrared band. NDRE had the largest validation R2, so it was selected as the optimal wheat LDM estimation model.

3.4. Yield Estimation based on UAV Images

In this study, a multi-spectral camera mounted on a UAV was used to obtain image data of the wheat canopy at key growth stages. We extracted the reflectances from the image data and calculated the required vegetation indices (Table 4). Yield predictions from vegetation indices at the tillering stage were poor. At the jointing stage, the determination coefficients between the vegetation indices and yield ranged from 0.3858 to 0.6328, with NDVI the best index, explaining 63.28% of the variation. At the booting stage (R2 = 0.5949–0.7617 across indices), NDVI again performed best (R2 = 0.7617). At the flowering stage (R2 = 0.6910–0.7838), NDRE performed best (R2 = 0.7838), and at the filling stage (R2 = 0.4057–0.6806), CCCI performed best (R2 = 0.6806). In general, the vegetation indices at the booting, flowering, and filling stages fitted the data well, and the yield estimation model based on NDRE showed the best results.
Grain yield is affected by multiple factors. Therefore, in addition to the traditional single-factor models, we also constructed multi-factor models to increase model accuracy and stability. Traditional linear methods may suffer from problems such as collinearity when constructing multi-factor models, so we also introduced several commonly used machine-learning methods for yield estimation. Four strategies were used to assemble the modeling datasets (the four index combinations in Table 5), and LR, MLR, SMLR, PLSR, ANN, and RF were adopted for modeling and estimation. The relationships between the measured and estimated yields of the different models are shown in Table 5.
The yield estimation model established with NDRE at the flowering stage explained 70% of the variation (RRMSE = 0.1307), so it performed well (Table 5). Table 5 also shows that the flowering stage was the best growth stage for building the yield estimation model. Therefore, NDVI (R2 = 0.7661), NDRE (R2 = 0.7838), OSAVI (R2 = 0.7698), and CCCI (R2 = 0.7837), the single-factor indices with the highest R2 in Table 4, were combined to build a multi-factor estimation model for wheat yield (Table 5). The experiment randomly selected 72 samples for modeling and 30 samples for validation. Five methods, MLR, SMLR, PLSR, ANN, and RF, were used to build the wheat yield estimation model. As shown in Table 5, PLSR, ANN, and RF performed better. The validation results of MLR and SMLR were similar (the NDVI factor was excluded during SMLR modeling). The RRMSE of these five methods remained within a reasonable range of 0.1126–0.1353. Among them, ANN gave the optimal yield estimation model, explaining 77.01% of the variability with an RRMSE of 0.1126.
The overall performance of NDVI was the best; therefore, NDVI at the jointing stage (R2 = 0.6328), booting stage (R2 = 0.7617), flowering stage (R2 = 0.7661), and filling stage (R2 = 0.5692) was used to construct another wheat yield estimation model. Modeling and validation were carried out on the same dataset as above, and the five modeling methods (MLR, SMLR, PLSR, ANN, and RF) were again adopted for comparison (Table 5). The estimation accuracy of MLR and SMLR was similar (no factors were excluded during SMLR modeling), and the results were similar to those of the multiple vegetation indices at a single growth stage. PLSR, ANN, and RF performed better than MLR and SMLR (R2 = 0.7454–0.78). The RRMSE of the different methods remained within a reasonable range of 0.1030–0.1343. RF gave the optimal yield estimation model for this combination, explaining 78% of the variation with an RRMSE of 0.1030.
As Table 4 shows, NDVI had the best performance at the jointing stage, explaining 63.28% of the variability, and at the booting stage, explaining 76.17%. NDRE showed the best performance at the flowering stage, explaining 78.38% of the variability, and CCCI was the best vegetation index at the filling stage, explaining 68.06%. For multiple vegetation indices at multiple growth stages, we therefore chose NDVI at the jointing stage, NDVI at the booting stage, NDRE at the flowering stage, and CCCI at the filling stage to build an overall wheat yield estimation model. Modeling and validation were based on the same dataset and modeling methods as above (Table 5). The estimation accuracy of SMLR differed considerably from that of MLR (NDVI at the booting stage and CCCI at the filling stage were excluded during SMLR modeling). PLSR, ANN, and RF still performed well, explaining 75.82% to 76.67% of the variation. PLSR gave the optimal yield estimation model for this combination because it had the largest R2.

4. Discussion

Plants have spectral characteristics: they absorb, reflect, and radiate different wavelengths. UAV remote sensing for agriculture can detect these spectral characteristics, and light of different wavelengths has different effects on plant growth [42]. Image sensors mounted on UAVs are used to collect crop images in different bands and extract different features [43,44]. The multi-spectral sensor we used (Airphen) is an imaging sensor. Unlike non-imaging sensors such as GreenSeeker or RapidSCAN, Airphen collects image data in more spectral bands (six) with good spatial resolution (4.7 cm). Compared with hyperspectral sensors, Airphen is easier to operate and its data are more convenient to process; it is also light enough to be mounted on a UAV. We used a customized gimbal to connect the sensor firmly to the multi-rotor UAV so that the UAV can fly more steadily and capture clearer images. For small-scale crop monitoring tasks, multi-rotor UAVs have the advantages of low take-off and landing requirements, low cost, high flexibility, and high resolution compared with fixed-wing UAVs. Unlike high-altitude (satellite, aerospace) remote sensing, UAV remote sensing is convenient and can be adjusted in time and space as needed. For crop-scale monitoring, ground remote sensing is time-consuming, laborious, and inefficient. A comparison of different sensors and remote sensing platforms is given in Table 6.
LAI and LDM are primary crop growth parameters and are closely related to yield formation. The traditional measurement method is destructive sampling, which is time-consuming, laborious, and prone to human error. Remote sensing allows non-destructive estimation of LAI and LDM, which greatly improves efficiency and lays the groundwork for subsequent yield estimation. In this study, relationships between vegetation indices and growth parameters were constructed for the whole wheat growth season. Within a given growth stage, the differences among N rates (0–300 kg·ha−1) and seeding densities (1.2–3.5 million seedlings ha−1) caused growth parameters and vegetation indices to change in parallel. Vegetation indices can therefore represent the growth status of different treatments to some extent and can be used to estimate crop growth parameters non-destructively. We adopted simple regression to analyze these single-factor models, which is one of the common approaches for relating vegetation indices to growth parameters. When relating vegetation indices to LAI, GNDVI and NDVI showed a 'saturation' phenomenon [48], resulting in a poor fit to the dataset. The vegetation indices combining the red edge band and near-infrared band (NDRE, CIRE, CCCI, and RESAVI) performed better, explaining 71%, 71%, 70%, and 74% of the variability, respectively. The LAI estimation model constructed with RESAVI was the optimal model in this study (Table 7). The results of the LDM estimation models were similar: the better vegetation indices (NDRE, CIRE, and CCCI) also combined the red edge band and near-infrared band, explaining 74%, 73%, and 74% of the variation, respectively, and the LDM estimation model constructed with CIRE was the optimal model (Table 7). This is consistent with previous research results [49] and may be because the red edge band and the near-infrared band are more sensitive and can better characterize canopy growth dynamics.
At present, the conventional approach to wheat yield estimation by remote sensing is the empirical model, including linear and nonlinear models. Linear models are simple to compute, but the formation of wheat yield is usually non-linear [50]. In this study, LR and MLR are linear models; we also utilized nonlinear models, including SMLR, PLSR, ANN, and RF, of which ANN and RF are more recently developed machine-learning tools. In general, the nonlinear estimation models outperformed the linear models in this study, possibly because linear methods are often strongly empirical and less accurate [51]. The correlation between the vegetation indices and yield from the booting stage to the filling stage was better than that at the early growth stages, which is consistent with the results of Zhu et al. [19]. Because the imaging and sampling at the filling stage took place late in that stage rather than in its early or middle part, the correlation coefficients at the filling stage decreased: wheat leaves begin to senesce in the late filling stage and are no longer suitable for modeling and analysis.
When constructing the yield estimation model from a single factor, we selected NDRE at the flowering stage, which performed best throughout the entire season, and established the model by simple regression. NDRE at the flowering stage was significantly correlated with yield, so it is feasible to estimate wheat yield with sensitive vegetation indices, which is consistent with previous studies [17]. When constructing multi-factor yield estimation models, we selected the four best-performing vegetation indices from different perspectives and established the models with several multivariate modeling methods. In this study, the estimation accuracy of the multi-factor models was much higher than that of the single-factor models. The multi-factor yield estimation model with the lowest R2 was the one constructed with NDVI at the jointing, booting, flowering, and filling stages using MLR and SMLR, which performed almost identically. Similar results were obtained for the model constructed with NDVI, NDRE, OSAVI, and CCCI at the flowering stage. However, in the model constructed with NDVI at the jointing stage, NDVI at the booting stage, NDRE at the flowering stage, and CCCI at the filling stage, MLR was superior to SMLR. The difference between MLR and SMLR depends mainly on the number of factors excluded during SMLR modeling. Overall, the estimation accuracy of PLSR, ANN, and RF was better than that of MLR and SMLR. In the model constructed with NDVI at the jointing stage, NDVI at the booting stage, NDRE at the flowering stage, and CCCI at the filling stage, PLSR was the optimal method with the highest R2 (0.7667). ANN performed best in the model constructed with NDVI, NDRE, OSAVI, and CCCI at the flowering stage, explaining 77.01% of the variation. Overall, the optimal yield estimation model was constructed with NDVI at the jointing, booting, flowering, and filling stages using the RF method, explaining 78% of the variability (Table 7). Machine-learning methods (PLSR, ANN, and RF) are advantageous for regression with non-linear relationships because they avoid multicollinearity and reduce interference factors, thus further improving the accuracy of the yield estimation model.
When establishing the single-factor models, we adopted only simple regression because single-factor models do not suffer from multicollinearity or related problems that arise in multi-factor models; different modeling methods would improve the accuracy only marginally (for example, PLSR gave results very similar to simple linear regression). In exploring modeling methods for the multi-factor yield estimation model, we adopted several commonly used methods, including MLR, SMLR, PLSR, ANN, and RF; other popular methods, such as support vector machine (SVM), are worth trying in future studies. A limitation of this study is its specific locality: the experiments were conducted only in Jiangsu Province, so without verification at other ecological sites and further experiments, the findings may lack universality.

5. Conclusions

This study explored the potential of a multi-spectral camera mounted on a multi-rotor UAV for monitoring wheat growth indices. The results showed that vegetation indices composed of the red edge band and near-infrared band were significantly correlated with LAI and LDM. The optimal LAI estimation model was built with RESAVI, explaining 76% of the variation with an RRMSE of 0.1990, and the optimal LDM estimation model was built with NDRE, explaining 74% of the variation with an RRMSE of 0.2337. We also analyzed and evaluated the yield estimation accuracy of LR, MLR, SMLR, PLSR, ANN, and RF. The results showed that multi-factor yield estimation models were superior to single-factor models, and the models built with PLSR, ANN, and RF performed better than the others. The optimal yield estimation model was built with NDVI at the jointing, booting, flowering, and filling stages using RF, explaining 78% of the variation with an RRMSE of 0.1030. In summary, this study demonstrates the potential of a multi-rotor UAV combined with a multi-spectral camera for monitoring wheat growth parameters and estimating yield, and a variety of linear and nonlinear modeling methods were used to explore further improvements in yield estimation accuracy. The UAV sensing system in this study can provide a reference and technical support for management and decision-making in intensive farming at the medium field scale, where it is more efficient than handheld sensors. In future studies, this method should be combined with satellite data to extend its application to large-scale agricultural areas.

Author Contributions

Z.F., X.L., and Y.Z. conceived and designed the experiments; Z.F., Y.G., and X.L. performed the experiments with support from M.W. and K.Z. Z.F. analyzed the data and wrote the original manuscript; X.L. and Y.Z. provided advice and revised the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

The work was funded by the National Key Research and Development Program of China (2016YFD0200602; 2016YFD0300604), the Fundamental Research Funds for the Central Universities (KYZ201602), the earmarked fund for Jiangsu Agricultural Industry Technology System (JATS(2018)290; JATS(2018)082), the 111 project (B16026), and Jiangsu Province Key Technologies R&D Program (BE2016375).

Acknowledgments

We would like to thank Jiayi Zhang, Yanyu Wang, Songyang Li, and Ke Zhang for their help and valuable advice. We would also like to thank Yang Gao, Xinge Li, Meng Wang, and Kaitai Zhong for their help with sample processing.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, L.; Liu, J.; Yang, L.; Chen, Z.; Wang, X.; Ouyang, B. Applications of unmanned aerial vehicle images on agricultural remote sensing monitoring. Trans. Chin. Soc. Agric. Eng. 2013, 29, 136–145.
  2. Li, D.; Li, M. Research Advance and Application Prospect of Unmanned Aerial Vehicle Remote Sensing System. Geomat. Inf. Sci. Wuhan Univ. 2014, 39, 505–513.
  3. Li, B.; Liu, R.; Liu, S.; Liu, Q.; Liu, F.; Zhou, G. Monitoring vegetation coverage variation of winter wheat by low-altitude UAV remote sensing system. Trans. Chin. Soc. Agric. Eng. 2012, 28, 160–165.
  4. Chen, Q.; Zhang, Z.; Liu, P.; Wang, X.; Jiang, F. Monitoring of Growth Parameters of Sweet Corn Using CGMD302 Spectrometer. Agric. Sci. Technol. 2015, 16, 364–368.
  5. Zhang, J.; Liu, X.; Liang, Y.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Using a Portable Active Sensor to Monitor Growth Parameters and Predict Grain Yield of Winter Wheat. Sensors 2019, 19, 1108.
  6. Zhang, N.; Qi, B.; Zhao, J.; Zhang, X.; Wang, S.; Zhao, T.; Gai, J. Prediction for Soybean Grain Yield Using Active Sensor GreenSeeker. Acta Agron. Sin. 2014, 40, 657–666.
  7. Yang, G.; Li, C.; Yu, H.; Xu, B.; Feng, H.; Gao, L.; Zhu, D. UAV based multi-load remote sensing technologies for wheat breeding information acquirement. Trans. Chin. Soc. Agric. Eng. 2015, 31, 184–190.
  8. Tian, M.; Ban, S.; Chang, Q.; You, M.; Luo, D.; Wang, L.; Wang, S. Use of hyperspectral images from UAV-based imaging spectroradiometer to estimate cotton leaf area index. Trans. Chin. Soc. Agric. Eng. 2016, 32, 102–108.
  9. Zhao, X.; Yang, G.; Liu, J.; Zhang, X.; Xu, B.; Wang, Y.; Zhao, C.; Gai, J. Estimation of soybean breeding yield based on optimization of spatial scale of UAV hyperspectral image. Trans. Chin. Soc. Agric. Eng. 2017, 33, 110–116.
  10. Han, L.; Yang, G.; Dai, H.; Xu, B.; Yang, H.; Feng, H.; Li, Z.; Yang, X. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 2019, 15, 10.
  11. Zhou, X.; Zheng, H.; Xu, X.; He, J.; Ge, X.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.; Tian, Y. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. 2017, 130, 246–255.
  12. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412.
  13. Ata-Ul-Karim, S.; Zhu, Y.; Yao, X.; Cao, W. Determination of critical nitrogen dilution curve based on leaf area index in rice. Field Crop. Res. 2014, 167, 76–85.
  14. Córcoles, J.; Ortega, J.; Hernández, D.; Moreno, M. Estimation of leaf area index in onion (Allium cepa L.) using an unmanned aerial vehicle. Biosyst. Eng. 2013, 115, 31–42.
  15. Gao, L.; Yang, G.; Yu, H.; Xu, B.; Zhao, X.; Dong, J.; Ma, Y. Retrieving winter wheat leaf area index based on unmanned aerial vehicle hyperspectral remote sensing. Trans. Chin. Soc. Agric. Eng. 2016, 32, 113–120.
  16. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. 2015, 108, 245–259.
  17. Tan, C.; Du, Y.; Tong, L.; Zhou, J.; Luo, M.; Yan, W.; Chen, F. Comparison of the Methods for Predicting Wheat Yield Based on Satellite Remote Sensing Data at Anthesis. Sci. Agric. Sin. 2017, 50, 3101–3109.
  18. Chen, Z.; Ren, J.; Tang, H.; Shi, Y.; Leng, P.; Liu, J.; Wang, L.; Wu, W.; Yao, Y.; Hasiyuya. Progress and perspectives on agricultural remote sensing research and applications in China. J. Remote Sens. 2016, 20, 748–767.
  19. Zhu, W.; Li, S.; Zhang, X.; Li, Y.; Sun, Z. Estimation of winter wheat yield using optimal vegetation indices from unmanned aerial vehicle remote sensing. Trans. Chin. Soc. Agric. Eng. 2018, 34, 78–86.
  20. Gong, Y.; Duan, B.; Fang, S.; Zhu, R.; Wu, X.; Ma, Y.; Peng, Y. Remote estimation of rapeseed yield with unmanned aerial vehicle (UAV) imaging and spectral mixture analysis. Plant Methods 2018, 14, 70.
  21. Yu, N.; Li, L.; Schmitz, N.; Tian, L.; Greenberg, J.; Diers, B. Development of methods to improve soybean yield estimation and predict plant maturity with an unmanned aerial vehicle based platform. Remote Sens. Environ. 2016, 187, 91–101.
  22. Wang, P.; Zhang, J.; Lan, Y.; Zhou, Z.; Luo, X. Radiometric calibration of low altitude multispectral remote sensing images. Trans. Chin. Soc. Agric. Eng. 2014, 30, 199–206.
  23. Taddeo, S.; Dronova, I.; Depsky, N. Spectral vegetation indices of wetland greenness: Responses to vegetation structure, composition, and spatial distribution. Remote Sens. Environ. 2019, 234, 111467.
  24. Qi, J.; Kerr, Y.; Moran, M.; Weltz, M.; Huete, A.; Sorooshian, S.; Bryant, R. Leaf Area Index Estimates Using Remotely Sensed Data and BRDF Models in a Semiarid Region. Remote Sens. Environ. 2000, 73, 18–30.
  25. Fitzgerald, G.; Rodriguez, D.; Christensen, L.; Belford, R.; Sadras, V.; Clarke, T. Spectral and thermal sensing for nitrogen and water status in rainfed and irrigated wheat environments. Precis. Agric. 2006, 7, 223–248.
  26. Dong, L.; Qingsong, G.; Wenjiang, H.; Linsheng, H.; Guijun, Y. Remote sensing inversion of leaf area index based on support vector machine regression in winter wheat. Trans. Chin. Soc. Agric. Eng. 2013, 29, 117–123.
  27. Wu, Y.; He, L.; Wang, Y.; Liu, B.; Wang, Y.; Guo, T.; Feng, W. Dynamic model of vegetation indices for biomass and nitrogen accumulation in winter wheat. Acta Agron. Sin. 2019, 45, 1238–1249.
  28. Steven, M. The Sensitivity of the OSAVI Vegetation Index to Observational Parameters. Remote Sens. Environ. 1998, 63, 49–60.
  29. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107.
  30. Li, F.; Mistele, B.; Hu, Y.; Yue, X.; Yue, S.; Miao, Y.; Chen, X.; Cui, Z.; Meng, Q.; Schmidhalter, U. Remotely estimating aerial N status of phenologically differing winter wheat cultivars grown in contrasting climatic and geographic zones in China and Germany. Field Crop. Res. 2012, 138, 21–32.
  31. Sripada, R.; Heiniger, R.; White, J.; Meijer, A. Aerial Color Infrared Photography for Determining Early In-Season Nitrogen Requirements in Corn. Agron. J. 2006, 98, 968.
  32. Wigneron, J.; Jackson, T.; O'Neill, P.; De Lannoy, G.; de Rosnay, P.; Walker, J.; Ferrazzoli, P.; Mironov, V.; Bircher, S.; Grant, J.; et al. Modelling the passive microwave signature from land surfaces: A review of recent results and application to the L-band SMOS & SMAP soil moisture retrieval algorithms. Remote Sens. Environ. 2017, 192, 238–262.
  33. Chen, Y.; Ma, W.; Wang, X.; Zhao, C. Relationship between Soil Nutrient and Wheat Yield Based on PLS. Trans. Chin. Soc. Agric. Mach. 2012, 43, 159–164.
  34. Yu, X.; Liu, Q.; Wang, Y.; Liu, X.; Liu, X. Evaluation of MLSR and PLSR for estimating soil element contents using visible/near-infrared spectroscopy in apple orchards on the Jiaodong peninsula. Catena 2016, 137, 340–349.
  35. Kasim, N.; Shi, Q.; Wang, J.; Sawut, R.; Nurmemet, I.; Isak, G. Estimation of spring wheat chlorophyll content based on hyperspectral features and PLSR model. Trans. Chin. Soc. Agric. Eng. 2017, 33, 208–216.
  36. Wold, S.; Sjöström, M.; Eriksson, L. PLS-regression: A basic tool of chemometrics. Chemometr. Intell. Lab. 2001, 58, 109–130.
  37. Rossel, R.; Behrens, T. Using data mining to model and interpret soil diffuse reflectance spectra. Geoderma 2010, 158, 46–54.
  38. Wang, Y.; Zhang, K.; Tang, C.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Estimation of Rice Growth Parameters Based on Linear Mixed-Effect Model Using Multispectral Images from Fixed-Wing Unmanned Aerial Vehicles. Remote Sens. 2019, 11, 1371.
  39. Were, K.; Bui, D.; Dick, Ø.; Singh, B. A comparative assessment of support vector regression, artificial neural networks, and random forests for predicting and mapping soil organic carbon stocks across an Afromontane landscape. Ecol. Indic. 2015, 52, 394–403.
  40. Zhang, C.; Yang, G.; Li, H.; Tang, F.; Liu, C.; Zhang, L. Remote Sensing Inversion of Leaf Area Index of Winter Wheat Based on Random Forest Algorithm. Sci. Agric. Sin. 2018, 51, 855–867.
  41. He, T.; Xie, C.; Liu, Q.; Guan, S.; Liu, G. Evaluation and Comparison of Random Forest and A-LSTM Networks for Large-scale Winter Wheat Identification. Remote Sens. 2019, 11, 1665.
  42. Duan, B.; Liu, Y.; Gong, Y.; Peng, Y.; Wu, X.; Zhu, R.; Fang, S. Remote estimation of rice LAI based on Fourier spectrum texture from UAV image. Plant Methods 2019, 15.
  43. Zhu, W.; Sun, Z.; Huang, Y.; Lai, J.; Li, J.; Zhang, J.; Yang, B.; Li, B.; Li, S.; Zhu, K.; et al. Improving Field-Scale Wheat LAI Retrieval Based on UAV Remote-Sensing Observations and Optimized VI-LUTs. Remote Sens. 2019, 11, 2456.
  44. Li, S.; Yuan, F.; Ata-Ul-Karim, S.T.; Zheng, H.; Cheng, T.; Liu, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cao, Q. Combining Color Indices and Textures of UAV-Based Digital Imagery for Rice LAI Estimation. Remote Sens. 2019, 11, 1763.
  45. Guo, W.; Zhu, Y.; Wang, H.; Zhang, J.; Dong, P.; Qiao, H. Monitoring Model of Winter Wheat Take-all Based on UAV Hyperspectral Imaging. Trans. Chin. Soc. Agric. Mach. 2019, 50, 162–169.
  46. Zarco-Tejada, P.; Diaz-Varela, R.; Angileri, V.; Loudjani, P. Tree height quantification using very high resolution imagery acquired from an unmanned aerial vehicle (UAV) and automatic 3D photo-reconstruction methods. Eur. J. Agron. 2014, 55, 89–99.
  47. Jing, Y.; Li, G.; Huang, W. Estimation of double cropping rice planting area using similar index and linear spectral mixture model. Trans. Chin. Soc. Agric. Eng. 2013, 29, 177–183.
  48. Goswami, S.; Gamon, J.; Vargas, S.; Tweedie, C. Relationships of NDVI, Biomass, and Leaf Area Index (LAI) for six key plant species in Barrow, Alaska. PeerJ 2015, 3, e911v–e913v.
  49. Knipling, E.B. Physical and physiological basis for the reflectance of visible and near-infrared radiation from vegetation. Remote Sens. Environ. 1970, 1, 155–159.
  50. Li, R.; Li, C.; Xu, X.; Wang, J.; Yang, X.; Huang, W.; Pan, Y. Winter wheat yield estimation based on support vector machine regression and multi-temporal remote sensing data. Trans. Chin. Soc. Agric. Eng. 2009, 25, 114–117.
  51. Groten, S. NDVI—Crop monitoring and early yield assessment of Burkina Faso. Int. J. Remote Sens. 1993, 14, 1495–1515.
Figure 1. A map illustrating the study site. Wheat experiments were carried out at the Suining, Xinghua, and Kunshan Experimental Stations in Jiangsu Province, China in 2018–2019.
Figure 2. Region of interest and one of the ground control points.
Figure 3. The flowchart of this study.
Figure 4. Leaf area index plotted against vegetation indices for the whole growth season of wheat: (a) GNDVI; (b) NDVI; (c) NDRE; (d) RVI; (e) CIRE; (f) OSAVI; (g) SAVI; (h) CCCI; (i) RESAVI.
Figure 5. Validation of leaf area index (LAI) estimation models for the entire growth season of wheat: (a) GNDVI; (b) NDVI; (c) NDRE; (d) RVI; (e) CIRE; (f) OSAVI; (g) SAVI; (h) CCCI; (i) RESAVI.
Figure 6. Leaf dry matter (t ha−1) plotted against vegetation indices for the whole season: (a) GNDVI; (b) NDVI; (c) NDRE; (d) RVI; (e) CIRE; (f) OSAVI; (g) SAVI; (h) CCCI; (i) RESAVI.
Figure 7. Validation of leaf dry matter (LDM) estimation models for the entire growth season of wheat: (a) GNDVI; (b) NDVI; (c) NDRE; (d) RVI; (e) CIRE; (f) OSAVI; (g) SAVI; (h) CCCI; (i) RESAVI.
Table 1. Basic information about experimental design and data acquisition.
Experiment and Place | Cultivars and Season | Nitrogen Rates (kg ha−1) | Seeding Densities (million seedlings ha−1) | Plot Size | Sampling Stages
Exp. 1, Xinghua | Yangmai-23, 2018–2019 | N0 (0), N1 (180), N2 (240), N3 (300) | D1 (1.2), D2 (1.8), D3 (2.4) | 30 m² (5 m × 6 m), 90 plots | Tillering (6 March 2019), Jointing (12 March 2019), Booting (4 April 2019), Flowering (20 April 2019), Filling (9 May 2019)
Exp. 2, Kunshan | Yangmai-15, 2018–2019 | N0 (0), N1 (180), N2 (270) | D1 (1.2), D2 (1.8), D3 (2.4) | 24.5 m² (5.5 m × 4.5 m), 84 plots | Tillering (4 March 2019), Jointing (14 March 2019), Booting (30 March 2019), Flowering (17 April 2019), Filling (5 May 2019)
Exp. 3, Suining | Xumai-33, 2018–2019 | N0 (0), N1 (180), N2 (240), N3 (300) | D1 (1.5), D2 (2.5), D3 (3.5) | 20 m² (4 m × 5 m), 108 plots | Tillering (7 March 2019), Jointing (13 March 2019), Booting (8 April 2019), Flowering (24 April 2019), Filling (15 May 2019)
Table 2. Selected vegetation indices used for leaf area index, leaf dry matter, and yield estimation.

Vegetation Index | Formulation | Reference
GNDVI | (NIR − G)/(NIR + G) | [23]
NDVI | (NIR − R)/(NIR + R) | [24]
NDRE | (NIR − RE)/(NIR + RE) | [25]
RVI | NIR/R | [26]
CIRE | (NIR/RE) − 1 | [27]
OSAVI | (NIR − R)/(NIR + R + 0.16) | [28]
SAVI | (1 + L) × (NIR − R)/(NIR + R + L) | [29]
CCCI | (NDRE − NDREmin)/(NDREmax − NDREmin) | [30]
RESAVI | 1.5 × (NIR − RE)/(NIR + RE + 0.5) | [31]
Table 3. Descriptive statistics of wheat leaf area index (LAI), leaf dry matter (LDM), and yield of different cultivars and nitrogen levels.
Indicators | Sample Number | Min | Max | Mean | SD | C.V. (%)
Modeled dataset
LAI | 210 | 0.7524 | 4.9392 | 2.5707 | 0.9721 | 37.81
LDM (t/ha) | 210 | 2.6224 | 23.4500 | 11.0954 | 4.7065 | 42.42
Yield (t/ha) | 72 | 1.3000 | 8.5000 | 6.0011 | 1.8065 | 30.10
Validated dataset
LAI | 90 | 0.6765 | 5.3046 | 2.6112 | 1.0527 | 40.31
LDM (t/ha) | 90 | 2.2429 | 21.8400 | 11.1606 | 5.0583 | 45.32
Yield (t/ha) | 30 | 3.0920 | 8.5102 | 6.4665 | 1.4044 | 21.72
Note: Min is the minimum, Max is the maximum, Mean is the average value, SD is the standard deviation, and C.V. is the coefficient of variation.
Table 4. Determination coefficients between grain yield and different vegetation indices.
Vegetation Index | Tillering Stage | Jointing Stage | Booting Stage | Flowering Stage | Filling Stage
GNDVI (850, 570) | 0.1139 ** | 0.5378 ** | 0.7199 ** | 0.7422 ** | 0.4057 **
NDVI (850, 675) | 0.1491 ** | 0.6328 ** | 0.7617 ** | 0.7661 ** | 0.5692 **
NDRE (850, 730) | 0.1757 ** | 0.4234 ** | 0.6841 ** | 0.7838 ** | 0.6804 **
RVI (850, 675) | 0.1433 ** | 0.4073 ** | 0.5949 ** | 0.6910 ** | 0.5394 **
CIRE (850, 730) | 0.1768 ** | 0.3858 ** | 0.6251 ** | 0.7455 ** | 0.6614 **
OSAVI (850, 675) | 0.1494 ** | 0.5770 ** | 0.7339 ** | 0.7698 ** | 0.5384 **
SAVI (850, 675) | 0.1486 ** | 0.5284 ** | 0.6779 ** | 0.7398 ** | 0.5045 **
CCCI (850, 730) | 0.1757 ** | 0.4234 ** | 0.6841 ** | 0.7837 ** | 0.6806 **
RESAVI (850, 730) | 0.1704 ** | 0.3882 ** | 0.6279 ** | 0.7527 ** | 0.6622 **
Note: ** indicates correlation significant at the 0.01 level; bold indicates the best correlation at each stage.
Table 5. Validation of yield estimation models built with simple linear regression (LR), multiple linear regression (MLR), stepwise multiple linear regression (SMLR), partial least squares regression (PLSR), artificial neural network (ANN), and random forest (RF).

Vegetation Indices for Modeling | LR | MLR | SMLR | PLSR | ANN | RF
NDRE (Flowering) | 0.7000 a / 0.1307 b | \ | \ | \ | \ | \
NDVI (Flowering), NDRE (Flowering), OSAVI (Flowering), CCCI (Flowering) | \ | 0.7490 a / 0.1142 b | 0.7490 a / 0.1142 b | 0.7542 a / 0.1353 b | 0.7701 a / 0.1126 b | 0.7606 a / 0.1149 b
NDVI (Jointing), NDVI (Booting), NDVI (Flowering), NDVI (Filling) | \ | 0.7186 a / 0.1163 b | 0.7186 a / 0.1163 b | 0.7571 a / 0.1343 b | 0.7454 a / 0.1100 b | 0.7800 a / 0.1030 b
NDVI (Jointing), NDVI (Booting), NDRE (Flowering), CCCI (Filling) | \ | 0.7572 a / 0.1113 b | 0.7308 a / 0.1197 b | 0.7667 a / 0.1353 b | 0.7582 a / 0.1132 b | 0.7602 a / 0.1165 b
Note: a indicates the determination coefficient R2 and b indicates the relative root mean square error (RRMSE). NDVI (Jointing), NDVI (Booting), NDVI (Flowering), and NDVI (Filling) denote the normalized difference vegetation index at the jointing, booting, flowering, and filling stages, respectively; NDRE (Flowering) denotes the normalized difference red edge index at the flowering stage; OSAVI (Flowering) denotes the optimized soil-adjusted vegetation index at the flowering stage; and CCCI (Flowering) and CCCI (Filling) denote the canopy chlorophyll content index at the flowering and filling stages, respectively.
Table 6. Parameters comparison for different sensors and platforms.
Name | Category | Content | Example | Parameters | Reference
Sensors | Imaging sensor | Hyperspectral camera | Cubert UHD185 Firefly imaging spectrometer on a UAV | 125 bands (450–950 nm), sensor resolution: 2 million pixels | [45]
Sensors | Imaging sensor | Multispectral camera | Airphen | 6 bands (450, 530, 570, 675, 730, 850 nm), 1280 × 960 pixels | This paper
Sensors | Non-imaging sensor | \ | GreenSeeker | 2 bands (656 nm, 774 nm), handheld | [6]
Platforms | Ground | \ | Analytical spectral devices (ASD) | Wavelength range: 325–1075 nm, spectral resolution: 3 nm | [45]
Platforms | Low-altitude | Fixed-wing UAV | FW I | Endurance: 60 min, maximum speed: 17.5 m/s | [46]
Platforms | Low-altitude | Multi-rotor UAV | DJI M600 Pro | Endurance: 38 min, maximum speed: 18 m/s | This paper
Platforms | High-altitude | Satellite | MODIS | Spatial resolution: 250 m (bands 1–2), 500 m (bands 3–7), 1000 m (bands 8–36) | [47]
Table 7. The optimal estimation model of LAI, LDM, and yield in this study.
Optimal Estimation Model | Vegetation Index | Modeling Method | Modeling R2 | Verified R2 | RRMSE
LAI | RESAVI (all stages) | LR | 0.74 | 0.74 | 0.1990
LDM | CIRE (all stages) | LR | 0.75 | 0.75 | 0.2372
Yield | NDVI (Jointing), NDVI (Booting), NDVI (Flowering), NDVI (Filling) | RF | 0.76 | 0.78 | 0.1030
