Review

A Review on the Use of Unmanned Aerial Vehicles and Imaging Sensors for Monitoring and Assessing Plant Stresses

by
Jayme Garcia Arnal Barbedo
Embrapa Agricultural Informatics, Campinas-SP 13083-886, Brazil
Submission received: 29 March 2019 / Revised: 12 April 2019 / Accepted: 17 April 2019 / Published: 20 April 2019
(This article belongs to the Special Issue UAV/Drones for Agriculture and Forestry)

Abstract
Unmanned aerial vehicles (UAVs) are becoming a valuable tool to collect data in a variety of contexts. Their use in agriculture is particularly suitable, as farmed areas are often vast, making ground scouting difficult, and sparsely populated, which means that injury and privacy risks are less of a concern than in urban settings. Indeed, the use of UAVs for monitoring and assessing crops, orchards, and forests has been growing steadily during the last decade, especially for the management of stresses such as water, diseases, nutrition deficiencies, and pests. This article presents a critical overview of the main advancements on the subject, focusing on the strategies that have been used to extract the information contained in the images captured during the flights. Based on the information found in more than 100 published articles and on our own research, a discussion is provided regarding the challenges that have already been overcome and the main research gaps that still remain, together with some suggestions for future research.

1. Introduction

Unmanned aerial vehicles (UAVs), also known as unmanned aerial systems (UAS) and drones, have been used in agricultural applications for some time, especially in the context of precision agriculture [1,2,3,4], but in recent years, there has been a steep growth in their adoption. In countries like the United States, agricultural applications already account for 19% of the whole UAV market [5]. Thus, UAVs are quickly becoming a valuable decision support tool for farmers and researchers dealing with agricultural problems.
There are a few reasons for the increased use of UAVs. First, UAV prices have been steadily decreasing, and most UAVs capable of handling agricultural applications now cost less than US$10,000 [4,5]. Although there are other costs associated with the operation of UAVs, including maintenance, insurance, training, image processing software, and navigation software [6,7], the monetary investment is quickly becoming less relevant in comparison to the potential benefits. Second, while regulations are still overly strict in some countries [8], these are slowly changing towards a better balance between safety and usability [4]. This is particularly true for rural areas, as these tend to be sparsely populated, so safety and privacy become less of an issue than in the case of densely populated urban areas. Third, many rural properties are extensive, making the timely detection of problems difficult by ground scouting alone. Since UAVs are capable of covering large areas much faster than people on the ground, they can be a great scouting tool, especially if used in combination with other sensors on the ground [9]. Although satellites can also cover large areas, their resolutions are still not enough for a fine-grained crop analysis. Fourth, imaging sensors have evolved considerably in the last decade. As a result, images with resolutions much higher than those offered by satellites can be obtained even with the UAV flying at high altitudes [4,10]. In turn, this makes it possible to detect problems before they become widespread. Fifth, UAVs have become much easier to operate because flight missions can be entirely automated using offline flight planning. Finally, image processing and machine learning tools have evolved to a point in which visual cues contained in an image can be successfully extracted and transformed into useful information for the management of the farm. Among those tools, deep learning is arguably the most impactful. This kind of technique models high-level data abstractions using a deep graph with several processing layers containing linear and non-linear transformations, having the potential to achieve good results without explicitly taking into account the many factors that influence the classification [11], which is particularly valuable in the case of remote sensing images [12].
Although the use of UAVs in agriculture has been steadily increasing, such growth is hindered by many technical challenges that still need to be overcome. Many studies trying to address those challenges have been carried out in recent years with various degrees of success. As the number of articles on the application of UAVs to agricultural problems grows, it becomes more difficult to track the progress that is being made on the subject. To make matters even more complicated, the variety of targeted applications is high, including tasks such as stress detection and quantification [13,14,15,16,17], yield prediction [18,19,20,21,22,23,24], biomass estimation [20,25,26,27], vegetation classification [28,29,30,31], canopy cover estimation [32,33,34,35,36], plant height assessment [37,38], and lodging [39,40], among others [41,42,43,44,45,46,47,48,49]. Each one of those applications has specific characteristics that must be taken into consideration to properly contextualize the impact of UAV-based technologies. Among those applications, stress detection and quantification is arguably the one that has received the greatest amount of attention, most likely due to the potential positive impact that early stress detection can have on the agricultural activity. As a consequence, a large amount of data has been generated and a wide variety of strategies have been proposed, making it difficult to keep track of the current state of the art on the subject and the main challenges yet to be overcome. In this context, the objective of this article is to provide a comprehensive overview of the progress made on the use of UAVs to monitor and assess plant stresses such as drought, diseases, nutrition deficiencies, pests, and weeds.
This article is primarily organized according to the type of stress (drought, diseases, nutrition disorders, pests, weeds, and temperature). Relevant aspects of each of those stresses are discussed in the respective sections, and more general aspects are discussed in a dedicated section. Basic information about all references identified for each type of stress is summarized in tables. The accuracies, coefficients of determination, and correlations associated with those studies are not presented because they are heavily context-dependent and cannot be directly compared.
The search for contextually relevant articles was done using Google Scholar and the scientific repositories Scopus, ScienceDirect, and IEEE Xplore. The search was carried out in February 2019 using three groups of keywords: the terms “drone”, “UAV”, and “UAS” were used in different combinations with the terms “stress”, “disease”, “nutrition”, “water”, “pest”, and “weed”, and also with the terms “plant” and “agriculture”. After the first batch of references was selected, their reference lists were examined in order to identify further articles of interest. This process resulted in the selection of more than 100 references.
It is worth noting that sensors other than imaging devices can be attached to UAVs, such as spectrometers [50] and LiDAR (Light Detection and Ranging) [51], but this article focuses solely on imaging sensors. It is also important to note that a detailed description of the different types of UAVs and sensors is beyond the scope of this work, but the literature contains abundant material about UAVs [4,7,52,53,54], sensors [4,55], and their calibration [56,57,58,59,60].

2. Literature Review on Plant Stress Detection and Quantification Using UAVs

2.1. Drought

The water status of a field is usually determined by taking multiple point measurements across it, but this approach often fails to properly represent the actual field heterogeneity. With the evolution of sensors and aircraft, indicators derived from remotely sensed images allow the characterization of entire fields with enough resolution to analyze plants individually [61]. This fact has motivated many research groups to explore this option in the last decade (Table 1).
Almost half of the studies applying UAVs for water stress analysis identified in this review were carried out in Spain, in semi-arid areas with significant fruit production. This is explained by the fact that orchard performance in semi-arid environments is closely related to the irrigation supply, which must be constantly monitored in order to optimize water productivity while maintaining the yield and economic return to growers [69].
The characterization of water stress on crops and orchards is a complex task because the effects of drought both influence and are influenced by several other factors [68]. For this reason, it is important to consider both the physiological effects and the spatial variability of the obtained data. The vast majority of UAV-based strategies dealing with crop drought try to synthesize the information contained in the images into a variable that is highly correlated with well-established ground measurements. The variables obtained by UAV and on the ground are usually related by means of some kind of regression analysis; that is, UAV-based measurements can be used to estimate ground values using simple equations (most frequently first-order polynomials).
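To make the form of this analysis concrete, the minimal sketch below fits a first-order polynomial relating a UAV-derived indicator to ground measurements; all numbers and variable names are hypothetical placeholders rather than data from any of the cited studies.

```python
# Minimal sketch (hypothetical data): relating a UAV-derived indicator to a
# ground-truth measurement with a first-order polynomial, as most of the
# reviewed studies do.
import numpy as np

# UAV-derived crop water stress index (CWSI) for a set of sampled trees
cwsi = np.array([0.12, 0.25, 0.33, 0.41, 0.55, 0.62, 0.78])
# Corresponding stem water potential measured on the ground (MPa)
stem_wp = np.array([-0.6, -0.8, -0.9, -1.1, -1.4, -1.5, -1.9])

# Fit a first-order polynomial (simple linear regression)
slope, intercept = np.polyfit(cwsi, stem_wp, deg=1)

# Coefficient of determination (R^2) of the fit
pred = slope * cwsi + intercept
ss_res = np.sum((stem_wp - pred) ** 2)
ss_tot = np.sum((stem_wp - np.mean(stem_wp)) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"stem_wp ~ {slope:.2f} * CWSI + {intercept:.2f}  (R2 = {r2:.2f})")
```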
With very few exceptions, the adopted ground reference is one of the following four variables:
-
The leaf water potential (Ψ_L), used in References [63,85,86], quantifies the tendency of water to move from one area to another in the leaf.
-
The stem water potential (Ψ_S), used in References [61,69,74,77,79,81,88], quantifies the tendency of water to move from one area to another in the stem.
-
The stomatal conductance, used in References [13,64,68,72,74,79,83,85,86], quantifies the rate of carbon dioxide and water vapor exchange through the leaf pores (stomata).
-
The water content, used in Reference [67], is given as a percentage with respect to a reference value.
Despite the variety of crops and conditions that have been studied, most strategies proposed in the literature extract one of the following four variables from aerial images:
-
The vegetation indices (NDVI, GNDVI, etc.), used in References [68,80,85,88], are the result of spectral transformations aimed at highlighting certain vegetation properties.
-
The photochemical reflectance index (PRI), used in References [64,67,81,82,85,86], is a reflectance measurement sensitive to changes in the carotenoid pigments present in leaves.
-
The difference between the canopy and air temperatures (Tc − Ta), used in Reference [69]; some studies use the canopy temperature directly [68,72].
-
The crop water stress index (CWSI), used in References [13,63,69,74,77,79,86], is based on the difference between canopy temperature and air temperature (Tc − Ta), normalized by the vapor pressure deficit (VPD) [86]. A related variable, called the non-water-stressed baseline (NWSB), was also used in some investigations [61].
The former two are associated with multispectral or hyperspectral images, and the latter two are associated with thermal infrared imagery. Water stress is the only application considered in this review for which thermal sensors are widely used. The rationale behind this is that water stress induces a decrease in stomatal conductance and less heat dissipation in plants, causing a detectable increase in the canopy temperature [68,70]. However, under certain conditions and especially when rewatering occurs after a long period under deficit levels, variables based on multispectral data such as PRI seem to lead to better results [81,82]. Red-Green-Blue (RGB) images have been employed sparingly, usually associated with multispectral or thermal images for the calculation of hybrid variables such as the Water Deficit Index (WDI) [70]. Chlorophyll fluorescence, calculated using narrow-band multispectral images, has also been sporadically applied to the problem of water stress detection and monitoring [84,85].
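For reference, a widely used empirical formulation of the CWSI (the exact baselines vary between studies and crops) normalizes the measured canopy-air temperature difference between a non-water-stressed lower baseline, modeled as a linear function of the VPD, and an upper limit corresponding to a non-transpiring canopy:

```latex
\mathrm{CWSI} = \frac{(T_c - T_a) - (T_c - T_a)_{\mathrm{LL}}}{(T_c - T_a)_{\mathrm{UL}} - (T_c - T_a)_{\mathrm{LL}}},
\qquad (T_c - T_a)_{\mathrm{LL}} = a + b \cdot \mathrm{VPD}
```

where (Tc − Ta)_LL is the NWSB, (Tc − Ta)_UL is the upper limit, and a and b are empirically fitted coefficients.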
The time of day in which images are captured plays an important role in the quality of the thermal measurements [69]. Some experiments have shown that the most favorable time of day to obtain thermal images is around midday, especially between 12:00 and 13:30, as the influence of shaded leaves and the variability in stomatal closure are minimized at this time [63]. In addition, some experiments have shown that to be comparable, measurements should be always made at the same times of the day because intervals as short as one hour can cause significant changes, especially if vegetation indices are used [88].
The problem of shades and shadows was addressed by Poblete et al. [77] by removing the respective pixels using automatic coregistration of thermal and multispectral images combined with a modified Scale Invariant Feature Transformation (SIFT) algorithm and K-means++ clustering. These authors observed a significant increase in the correlation between remotely sensed and reference water stress indicators when shadow pixels were deleted. In the context of multispectral images, conflicting conclusions have been reported: While some authors indicated that the removal of shadowed vegetation pixels clearly improved the correlations [82,88], others observed no improvement [81]. A possible explanation for this is that the impact of shadows may be directly related to the canopy architectures of different plant species [81].
Variables derived from thermal images often rely on very slight temperature variations to detect stresses and other phenomena. As a result, thresholds and regression equations derived under certain conditions usually do not hold under even slightly different circumstances. For example, different genotypes of a given crop may present significantly different canopy temperatures under the same conditions due to inherent differences in stomatal conductance and transpiration rates [64,69,74]. Also, young and old leaves may display highly contrasting gas exchanges, greatly affecting transpiration rates and thus altering the typical values of the chosen variables [61]. Crop maturity can also change the way variable values behave, as senesced crops naturally present lower transpiration [70]. Even canopy architecture has been shown to influence temperature behavior [74]. Such variability indicates that, while general procedures may be valid for a wide range of applications and conditions, specific thresholds and regression equations may need to be recalibrated whenever new conditions are expected.
In the case of multispectral images, factors such as canopy structure variation within the orchard, illumination, and viewing angle geometry effects also play an important role, but selecting the right wavelengths for the calculation of the variables seems to minimize deleterious effects [81].
The spatial resolution required for images to effectively indicate water stress is highly dependent on the characteristics of the crop, especially canopy volume [63] and closure [83]. Some studies have been able to map the water status of individual plants, which can be valuable for saving water resources. For this purpose, the minimum spatial resolution was found to be 30 cm per pixel for vineyards [63], while resolutions coarser than 10 cm were shown to be insufficient for precisely managing citrus orchard systems in which the yield is to be optimized with a restricted input of natural resources [67]. However, even higher resolutions may be needed to properly characterize the large leaf-to-leaf temperature variability that has often been observed in practice [72]. In addition, Stagakis et al. [81] reported that the presence of mixed pixels containing both soil and vegetation may be very deleterious, which further emphasizes the importance of having resolutions high enough for the majority of the pixels to be either nearly or completely pure.
Hyperspectral sensors used in UAVs provide the very high spectral resolutions needed to characterize subtle physiological effects of drought but are still not capable of delivering the spatial resolutions required for the plant-level management of crops and orchards. Thermal sensors, on the other hand, offer better spatial resolutions but capture only a very limited band of the spectrum. One way to minimize this tradeoff between spatial and spectral resolutions is to apply data fusion techniques capable of effectively exploiting the strengths of both types of sensors. Some encouraging results towards this goal have already been achieved using real and synthetic data representing citrus orchards [67].
It is interesting to note that, while machine learning classifiers are frequently employed in the detection and monitoring of plant stresses, they are very rarely used for water status assessment. Only two studies using this kind of strategy were found, both using multilayer perceptron (MLP) neural networks [76,78]. Poblete et al. [76] used a narrow-band multispectral camera to capture certain wavelengths that were used as input to the neural network, which was trained to predict the value of the midday stem water potential. In turn, Romero et al. [78] extracted several vegetation indices from multispectral images to be used as input to two models: The first was responsible for predicting the stem water potential, and the second aimed at classifying the samples into “non-stressed”, “moderately stressed”, and “severely stressed”. The main advantage of artificial neural networks (ANN) in comparison with regression models is that the former are capable of modelling complex nonlinear relationships between variables. On the other hand, the black-box nature of ANNs may lead to bias and overfitting if the network architecture and training process are not carefully designed.
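As a rough illustration of this type of approach (not a reproduction of the cited methods), the sketch below trains an MLP regressor to predict stem water potential from a few band reflectances; the data are random placeholders and the network architecture is an arbitrary choice.

```python
# Minimal sketch (hypothetical data) of an MLP-based water status predictor:
# mapping UAV-derived band reflectances to midday stem water potential.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Each row: reflectance in a few narrow bands for one vine/plot (placeholder)
X = np.random.rand(100, 5)
# Placeholder target (MPa), loosely dependent on two of the bands
y = -0.5 - 1.5 * X[:, 0] + 0.3 * X[:, 3] + 0.05 * np.random.randn(100)

# Scaling matters for MLPs; a pipeline keeps preprocessing consistent
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)
model.fit(X, y)
print("Predicted stem water potential:", model.predict(X[:3]))
```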
All approaches found in the literature that deal with water status assessment in crops and orchards have shortcomings that still prevent their practical adoption. Methods based on thermal images suffer a strong negative influence from factors such as soil background and shadows, canopy architecture, complex stomatal conductance patterns, and changes in physiological factors due to a variety of elements. Indicators derived from multispectral or hyperspectral images are affected by variations in canopy structure, illumination, and angle of capture, and spatial resolutions are often insufficient for precision agriculture purposes. RGB images only detect stress signs once they are clearly visible, by which point irreversible damage may have already occurred. With the development of new sensors and the application of more sophisticated computer vision and machine learning techniques, those limitations tend to become less relevant. However, truly robust solutions may only be possible by using fusion techniques to combine data coming from different sources in such a way that the limitations inherent to each data source are systemically compensated for by the strengths of its counterparts.

2.2. Nutrition Disorders

Currently, the most common way to determine nutritional status is visual inspection by means of plant color guides, which do not allow quantitatively rigorous assessments [90]. More accurate evaluations require laboratory leaf analyses, which are time consuming and require the application of specific methods for a correct interpretation of the data [91]. There are some indirect alternatives available for some nutrients, such as the chlorophyll meter (Soil Plant Analysis Development (SPAD)) for nitrogen predictions [92], but this is a time-consuming process [93] and the estimates are not always accurate [94]. Thus, considerable effort has been dedicated to the development of new methods for the detection and estimation of nutritional problems in plants [95].
A large portion of the nutrition deficiency studies found in the literature employ images captured by satellites [96,97,98]. Although some satellites launched recently can deliver submeter ground resolutions [99], those are still too coarse for plants to be analyzed individually, meaning that in many cases the deficiency can only be detected when it is already widespread. UAVs, on the other hand, can deliver ground sample distances (GSD) of less than one centimeter without the high costs and operation difficulties associated with manned aircraft [4]. Table 2 summarizes the characteristics of studies applying UAVs to nutrition status monitoring. It is worth noting that many of the references in Table 2 target not only the nutrients themselves but also the effects that the lack or excess of nutrients can have on yield, biomass, canopy cover, etc. Although relevant, these other applications are beyond the scope of this review and will not be discussed.
Nitrogen is, by far, the most studied nutrient due to its connection to biomass and yield. Potassium [103] and sodium [103,119] have also received some attention. Multispectral images have been the predominant choice for the extraction of meaningful features and indices [9,101,102,104,107,110,111,112,115,119,120,121,125], but RGB [26,105,106,108,113,125] and hyperspectral images [9,116] are also frequently adopted. Data fusion combining two or even three types of sensors (multispectral, RGB, and thermal) has also been investigated [26].
The vast majority of the studies found in the literature extract vegetation indices (VI) from the images and relate them to nutrient content using a regression model (usually linear). Although less common, other types of variables have also been used to feed the regression models, such as the average reflectance spectra [103], selected spectral bands [114,119,127], color features [118,123], and principal components [122]. All of these are calculated from hyperspectral images, except the color features, which are calculated from RGB images.
Only a few methods to determine nutrition status did not employ regression models: Discriminant analysis has been used for determining potassium status in canola [119], and machine learning classifiers like support vector machines (SVM) [124] and random forest [15] have been used to determine nitrogen status in corn and wheat.
Vegetation and spectral indices that are more related to pigments and chlorophyll content, such as the simplified canopy chlorophyll content index (SCCCI), transformed chlorophyll absorption reflectance index normalized by the optimized soil adjusted vegetation index (TCARI/OSAVI), triangular greenness index (TGI), and double peak canopy nitrogen index (DCNI), often perform better than indices more sensitive to biomass and canopy structure, such as the normalized difference vegetation index (NDVI), visible atmospherically resistant index (VARI), and normalized difference red edge (NDRE). This conclusion is supported by experiments in which chlorophyll-sensitive indices performed better at the early stages of the cotton season [101] and at several stages of the corn [9] and wheat [107] growing seasons. Further evidence came from the fact that GNDVI and NDVI acquired from a small UAV were not useful for in-season nitrogen management in potato crops [110]. On the other hand, pigment-sensitive indices seem to be more affected by the spatial resolution of the images, probably due to their higher sensitivity to uncovered soil effects [9], which are harder to remove at coarser resolutions due to the strong presence of mixed pixels. Thus, the advantage of this type of index is directly connected to the resolution of the images used to derive their values.
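For illustration, the sketch below computes one biomass-sensitive index (NDVI) and two of the pigment-sensitive indices mentioned above (NDRE and TGI) from co-registered reflectance bands; the band wavelengths in the comments follow the common definitions of these indices, but exact band choices vary between sensors and studies.

```python
# Minimal sketch: per-pixel vegetation indices from co-registered band mosaics.
import numpy as np

# Placeholder reflectance mosaics (rows x cols), one array per band
blue, green, red = (np.random.rand(100, 100) for _ in range(3))  # ~480, ~550, ~670 nm
red_edge, nir = (np.random.rand(100, 100) for _ in range(2))     # ~720, ~840 nm

eps = 1e-9  # avoids division by zero in dark or mixed pixels

ndvi = (nir - red) / (nir + red + eps)
ndre = (nir - red_edge) / (nir + red_edge + eps)
# Triangular greenness index: area of the triangle defined by reflectance
# at ~480, ~550, and ~670 nm (coefficients follow the usual band spacing)
tgi = -0.5 * (190.0 * (red - green) - 120.0 * (red - blue))
```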
In many instances, the nitrogen status is estimated indirectly from other variables, especially the chlorophyll content and leaf area index. Since the chlorophyll content is easier to measure than the nitrogen content (for example, using the SPAD device) [128], it is sometimes used as the target variable in investigations regarding remote nitrogen monitoring [110,117]. The problem with this approach is that the chlorophyll content may not be an accurate proxy of the nitrogen content if the concentration of the latter is high [111]. It is important to take this type of mismatch into account when using proxies for variables of interest.
Uncovered soil and shadowed soil may have an important impact on vegetation indices, especially when canopy cover is still small [102,105]. This not only can lead to significantly different variable values depending on the growth stage (even if the actual nutritional status does not change) but also can cause biased results due to the effects of the nutritional status on the canopy architecture: when there is a nitrogen deficit, leaf expansion is limited (more soil is uncovered), and when there is nitrogen excess, canopy closure happens earlier. As a result, spectral profile differences detected by many investigations may be more deeply connected to differences in ground cover than to the actual spectral properties of the plants [40]. Despite this, the removal of background pixels does not always improve the results [121]. Leaf shadows have also been found to decrease accuracy, especially when canopies are fuller and more widespread [100]. In order to reduce the number of mixed pixels and to properly resolve problematic areas (uncovered soil and shadows), high-resolution images are often required [105]. There are, however, exceptions: In an investigation in which multispectral images were used to monitor nitrogen status in sunflower crops, no significant changes in accuracy were observed when GSDs varying from 1 to 100 cm were adopted [121]. Also, in a study aimed at the detection of potassium deficiency in canola, multispectral images acquired at a higher altitude (GSD of 6.5 cm) yielded better results than images with a GSD of 0.8 cm [119]. Two possible explanations were offered for this counterintuitive result: (1) the effects of a reduced Leaf Area Index (LAI) in K-deficient plots were more effectively integrated in the low-resolution images; (2) the registration and ortho-rectification errors were more prominent for the high-resolution images, as motion blur was more intense due to the camera being closer to the subject.
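A minimal sketch of one common form of background removal is shown below: masking out soil pixels with an NDVI cutoff before computing plot-level statistics. The threshold value is a hypothetical example and, as noted above, this step does not always improve results.

```python
# Minimal sketch (hypothetical threshold): masking soil pixels via an NDVI
# cutoff before averaging a vegetation index over a plot. The threshold must
# be tuned per crop and growth stage.
import numpy as np

def masked_mean_index(index_map: np.ndarray, ndvi: np.ndarray,
                      soil_threshold: float = 0.3) -> float:
    """Mean of a vegetation index over pixels classified as vegetation."""
    vegetation = ndvi > soil_threshold   # crude soil/vegetation split
    if not vegetation.any():
        return float("nan")              # plot with no detected canopy
    return float(index_map[vegetation].mean())
```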
Some investigations have come to the conclusion that the nutrient status can be more accurately predicted under overcast conditions, as shadows are smoother and less prevalent [100,106]. However, other authors came to the opposite conclusion, citing spectral alterations caused by diffuse lighting as the main reason for recommending image acquisitions under direct sunlight [113]. It is worth mentioning that, under sunny conditions, pictures taken when the sun is too far from the nadir may contain large shaded areas, which can cause a loss of information and, as a consequence, a drop in accuracy [100]. Thus, it is often recommended that UAV missions be carried out around midday [106].
Plant traits vary according to the growth stage [116], and such variation may have a significant influence on the ability of algorithms and models to correctly predict nutrition status, especially in the case of nitrogen [103]. Different studies have reached very different conclusions regarding which growth stage is more favorable for remote nutrition analysis. The nitrogen status has been shown to be more easily estimated at later stages of development in crops like cotton [101]. A possible explanation is that soil and water background effects are more prevalent when plants are smaller; additionally, younger plants tend to have a higher biomass production rate compared to the nitrogen accumulation (N dilution effect) [101]. Conversely, the nitrogen concentration estimation was less accurate at later stages of development in the case of oat [116]. The explanation for this was that, as plants mature and start senescing, nitrogen and biomass are gradually reallocated to grains, reducing photosynthetic capacity and causing leaf discoloration. The problem of leaf senescence can be mitigated by including biophysical parameters, which can be derived from crop surface models [118] or other sources of information. In turn, nitrogen monitoring in rice was difficult in both early (jointing) and late (filling) growth stages, due to canopy mixing with the soil background during the early stage and then with panicles later in development [15]. Finally, Benincasa et al. [102] did not observe significant differences in accuracy between wheat crops at early and late seasons; however, they remarked that atypically intense rainfalls and preexisting soil conditions may have affected the results. Some studies have provided evidence that exploring the high spectral resolution of hyperspectral images to select the wavelengths that are most representative of each growing stage may be an effective way to address this issue and reduce inconsistencies [127].
N-deficient crops characteristically have decreased chlorophyll contents, stunted heights, thin stalks, and small, young leaves [40]. Difficulties arise from the fact that other stresses, including flooding and low temperatures, may produce the same kinds of effects [102]. Thus, the process of reliably determining the nutritional status of plants may need to include side information (variation of soil properties, weather, crop types, etc.) to contextualize the visual cues found in aerial images and to properly feed decision support systems [110].
Only three articles investigated the application of classical machine learning classifiers to nutrition status monitoring. Liu et al. [114] employed a multilayer perceptron neural network and a multiple linear regression model to determine the nitrogen status in wheat using hyperspectral images; the results obtained with the neural network were superior to those obtained using the regression model. Zermas et al. [124] employed K-means clustering to group pixels and SVM to find yellow pixels, but the classification itself was performed by a logistic regression model. Finally, Zheng et al. [15] tested seven types of machine learning classifiers, together with six types of regression, for monitoring nitrogen in wheat, with random forest (RF) yielding the best results.
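As an illustration of the classifier-based formulation (loosely patterned on [15], but with placeholder features and labels rather than the study's actual data), the sketch below cross-validates a random forest that maps per-plot image features to a nitrogen status class.

```python
# Minimal sketch (hypothetical data): random forest classification of nitrogen
# status from per-plot features extracted from multispectral images.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X = np.random.rand(120, 6)   # e.g., NDVI, NDRE, TCARI/OSAVI, ... per plot
y = np.random.choice(["deficient", "optimal", "surplus"], size=120)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # accuracy is context-dependent
print("Cross-validated accuracy:", scores.mean())
```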

2.3. Diseases

Traditionally, disease detection in crops has been carried out visually by people with some training or experience detecting plant disorders. As with any activity carried out by humans, this approach is subject to psychological and cognitive phenomena that may lead to bias, optical illusions, and ultimately error [129]. Image-based tools can, thus, play an important role in detecting and recognizing plant diseases when human assessment is unsuitable, unreliable, or unavailable [130], especially with the extended coverage provided by UAVs.
As can be seen in Table 3, and in contrast with the studies on water and nutrient status, which are strongly dominated by regression models, machine learning classifiers have been used frequently for disease detection and quantification [14,131,132,133,134,135,136,137,138,139]. RGB [131,132,135,136,138,139,140,141,142,143] and multispectral images [133,134,137,143,144,145,146,147,148,149,150,151] have been the preferred means of acquiring information about the studied areas, but hyperspectral [14,65,152] and thermal images [65,147,152,153] have also been tested. The latter are employed mostly to detect water stress signs potentially caused by the targeted disease.
Barbedo [154] discussed several challenges involved in automatic plant disease identification. Such a review, although focused on proximal sensing using visible range images, addresses several points that are relevant to UAV-acquired images:
-
Image background: Isolating plants from the background (mostly soil in the case of UAV-acquired images) can be a difficult problem in itself, and depending on the spatial resolution (GSD) of the images, mixed pixels (plant + soil, plant + shadows) will inevitably be present even if the plant segmentation is accurate, decreasing accuracy [144,145,147]. Some authors use heavily nonlinear techniques, such as convolutional neural networks, to address the problem of mixed pixels at the borders of the regions [132]. This type of approach can be very effective, but it depends on large amounts of carefully annotated images to work properly [130]; otherwise, the significance of the findings is limited [135]. Errors can also be minimized by performing the segmentation manually, but this can be a very labor-intensive task, and more importantly, the resulting method for disease detection will no longer be fully automatic, drastically reducing its appeal. In any case, the presence of weeds may make it very difficult to delineate the regions of interest and, consequently, to correctly detect and quantify the diseases [141].
-
Image capture conditions: Illumination concerns are especially important in the field, where aspects such as time of day, position of the sun with respect to the leaf, and overcast conditions can greatly affect image characteristics. In general, the recommendation is that images be captured either under overcast conditions or with the sun close to the nadir. Also, a perpendicular angle of capture is usually preferred to avoid perspective and occlusion issues. It is worth noting that some authors have elected to carry out flight missions at night, with plants being illuminated by a polarized light specifically designed to highlight the effects of the targeted disease [136].
-
Symptom variations: Most plant diseases produce physiological alterations that can be detected in certain bands of the spectrum. The problem is that those alterations can be highly variable depending on factors other than the disease itself, such as cultivar [140,142,144,145], leaf age [145], disease severity [145], weather conditions [148], and the presence of other stresses, among others. Designing experiments that take into account all those variations may be challenging or even unfeasible, which may lead to methods with limited practical use.
-
Other disorders and stresses: Experiments usually consider only the disorder(s) of interest and healthy plants (control). In practice, there are many factors that can cause responses similar to the targeted disease [145], and multiple stresses can be present simultaneously. Such a large degree of variability found in the real world is very difficult to emulate in any investigation. As a result, methods that performed well in controlled experiments often fail under more realistic conditions. This fact has led some authors to consider the possibility that UAV imagery may have its potential limited to acting as an alarm for anomalous coloring that would need to be checked in the field to determine its origin [144]. Additional information such as historic data about the crop [148] and digital surface models (DSM) revealing canopy height [141] may be valuable in this kind of context, as they may provide answers that can resolve potential ambiguities.
-
Covariate shift: Another very common problem arises from differences between the distributions of the data used to train the model and the data to which the model is applied, a situation commonly called covariate shift [155]. Although many research groups are working on domain adaptation solutions to mitigate this problem [156], a still unavoidable consequence is that a calibration step is needed whenever different conditions and geographic areas are to be considered. This problem has been recognized by a few authors, who added that calibration is often unfeasible in an operational context, as data collection for this task is time- and labor-intensive [144].
Experiments involving plant diseases can be difficult to operationalize because the affected plants need to remain untreated until all the experiments are carried out, which may cause the disease to spread to neighboring areas. To avoid an actual disease outbreak, some experiments have used proxies to the actual disease (e.g., herbicides) to cause physiological stress and to simulate the onset of symptoms [133,134]. It is important to emphasize, however, that while this kind of approach is useful if the objective is a general proof of concept, if specific diseases are targeted, their specificities need to be explicitly investigated in practical experiments.
Most experiments indicate that higher spatial resolutions lead to better results, since more meaningful information can be extracted when plants and respective structures can be analyzed individually [14]. However, there are some exceptions. Dash et al. [133,134] observed an increase in accuracy when the original multispectral images used for the detection of disease stress in pine trees were downsampled. They hypothesized that the increase in the sensitivity of resampled spectral indices might be a product of the resampling process, which can potentially act like a filter that increases the signal-to-noise ratio when the subject phenomenon is equal to or larger than the pixel size. Because of this, the authors remarked that this result should not be interpreted as direct evidence that a coarser spatial resolution imagery has a greater utility for stress monitoring.
To be useful, in most cases, the detection of diseases should occur as early as possible in order to avoid significant losses. However, disease signs tend to be very slight in the beginning of the infection, making detection difficult even while using proximal imagery with very high spatial and spectral resolutions [154]. Indeed, the correlation between UAV-derived and manual ratings for tomato spot wilt has been shown to consistently increase as the season progressed, which can be explained both by infected areas becoming larger and by the increased impact of the disease on canopy size and its health [151]. Other authors have reported high error rates when trying to detect diseases in plants with low infection levels [145].
Some diseases are known to simultaneously produce different kinds of changes. This has led some authors to combine different spectral domains to address the disease detection problem more systemically. For example, Calderón et al. [147] combined indices from the visible, red edge, near infrared, and thermal spectral regions to detect downy mildew in opium poppy. While this strategy may significantly improve the detection capabilities, it is important to consider that it requires multiple sensors, increasing operation costs and the payload to be carried by the UAV.
As stated previously, some studies used thermal images to detect changes in the water content as an indirect sign of the presence of certain diseases [65,147,152,153]. For this reason, the experimental characteristics and challenges faced by those studies are closely related to the observations drawn in the “Drought” section.

2.4. Others

The problem of pest detection using digital images shares several similarities with disease detection, as insects are primary vectors for many of the most important plant diseases. Thus, it is no surprise that factors such as illumination, angle of capture, and shadows have also been shown to have a significant impact on pest detection [16,157]. Another point in common is that the visual and spectral cues used to detect pests may be the result of several different factors, including other insects, which again stresses the need for other types of data for an unambiguous identification [16,157].
This close relation between insects and diseases may explain the relatively low number of studies on pests compared with the other stresses, as the identification of their effects is often treated simply as disease detection. However, although the deleterious effects of pest infestations are often linked to transmitted diseases, the insects themselves can cause considerable damage [158]. An effect that is common in forests and orchards is the defoliation of trees, a cue that has been explored in a few experiments [157,159]. Also, insects can disrupt physiological processes and photosynthesis, causing an impact on plant height, an effect that can be detected through photogrammetry [158,160].
Much of the research on pest detection focuses on forests (Table 4). This kind of area tends to be expansive and have limited access, making it difficult to obtain in situ reference data for the calibration of alternative methods. In addition, tall trees pose a significant threat to UAVs, often making regular scouting missions too risky and cost-ineffective. In this context, some researchers have used UAVs for the specific purpose of gathering ground-truth data for calibrating satellite data, which have a comparatively coarser GSD [16]. Thus, in cases like this, UAVs are not used for the pest monitoring itself but as an efficient tool for the acquisition of reference data.
Weed detection has also received some attention (Table 5). The general idea is to locate damaging weed species so herbicides can be precisely applied, thus reducing costs and environmental impacts. Because weeds can often be recognized in high-resolution aerial images by their distinctive canopy architectures and intermittence of their patches [163], detailed spectral information may not be as important as in the identification of other types of stress. Indeed, this kind of image has been preferred, with the adopted strategies including the use of mathematical morphology combined with deep learning for weed classification [163], the use of the Excess Green Index (ExG) combined with linear regression to investigate crop resistance to weed harrowing [164], the use of ExG combined with K-means to estimate weed infestation severity [17], and the use of statistical image descriptors combined with Random forest to classify image regions into crop (sugarcane) and two weed species [165]. The only exception was a preliminary study on the viability of using hyperspectral images to discriminate between the spectral signatures of some weeds with different resistances to glyphosate [166]. It is worth mentioning that occlusion by the main crop and poor illumination conditions have been pointed out as the main obstacles for the successful detection of weeds amidst commercial crops [163].
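As an illustration of the ExG-plus-clustering idea mentioned above, the sketch below computes the Excess Green Index on an RGB orthomosaic tile and clusters its values with K-means; the three-cluster choice and the interpretation of clusters as background, crop, and weed patches are assumptions made for this example, not details from the cited study.

```python
# Minimal sketch: Excess Green Index (ExG) followed by K-means clustering to
# produce a coarse per-pixel infestation map from an RGB tile.
import numpy as np
from sklearn.cluster import KMeans

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """ExG = 2g - r - b on chromaticity-normalized channels."""
    total = rgb.sum(axis=-1) + 1e-9
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2 * g - r - b

rgb = np.random.rand(200, 200, 3)        # placeholder orthomosaic tile
exg = excess_green(rgb).reshape(-1, 1)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(exg)
infestation_map = labels.reshape(200, 200)   # per-pixel cluster assignment
```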
The last stress that has been studied using UAVs is related to extreme temperature damage. Only two investigations that fit this category were identified (Table 6), both dealing with heat damage. Gennaro et al. [167] evaluated heat and radiative stress effects in vineyards in terms of temperature at the cluster and canopy levels, using both thermal and multispectral data. Malenovsky et al. [168] used UAVs to monitor the impact of climate change on Antarctic vegetation; they employed support vector regressions (SVR) to extract relevant information from the hyperspectral images, which in turn was used to infer Antarctic moss vigor as a function of the plant canopy chlorophyll content and leaf density.

3. General Discussion

As discussed in the previous section, each type of stress has specific aspects and factors that need to be considered when monitored by UAVs. There are, however, many issues that are common to all contexts considered in this review. This section’s aim is to address such issues, providing a comprehensive overview about their most relevant consequences and how deleterious impacts can be mitigated.
This section can be divided into two parts. The first part revisits some of the issues that were raised in Reference [4] in the context of UAV-based livestock monitoring, now addressing them in the context of stress detection and monitoring. Aspects that have already been discussed in detail in Reference [4] will only be briefly mentioned, with the reader being referred to that work for more information. The second part discusses new issues that were inferred from the references selected for this review, thus receiving a more detailed treatment.

3.1. Revisited Issues

There is a growing perception among different players in the agricultural sector that UAVs can be valuable tools for gathering information quickly and reliably, especially in areas that are remote and hostile. It is important to consider, however, that UAVs may be challenging to pilot and that the learning curve may be steep, even with GPS-enabled navigation. Rotary aircraft are, in general, easier to operate due to the vertical take-off and landing, but incidents are common even among experienced users. Landing UAVs in rough terrain, which is common in agricultural areas, is particularly challenging, especially in the case of fixed-wing aircraft. To minimize damage in case of an unsuccessful landing, it is recommended to keep the aircraft and payload weight substantially below nominal limits [10].
The threat of crashes and equipment loss is always present, as there are many hazards that can cause a UAV to crash: high winds, birds of prey, power lines, trees, signal loss, and so on. Although careful planning and monitoring can greatly reduce the risk, incidents are sometimes unavoidable, potentially causing damage to aircraft and sensors. In this context, it is recommended that spare parts and even extra UAVs always be available for replacement, especially when there are time constraints for completing the survey.
High-speed winds may cause yaw, pitch, and roll movements and can affect the speed of the aircraft. Because compensation by stabilization mechanisms has some response latency, angular movement will be inevitable, especially in the case of small rotary UAVs. Such angular movement alters the overlap between images and deflects sensors from nadir, thus damaging the mosaicking process and introducing a variety of distortions. Correcting those distortions is not a trivial task, but it is unavoidable if they are very prominent or if the method to be applied relies heavily on the geometrical characterization of the image [19,55,59,64,67,81,107,114,127,142,153,159,167]. Another problem associated with high wind speeds is that canopies will move and change appearance, which may cause inconsistencies and may impede proper image alignment due to the impossibility of finding enough common key points between the photographs.
Although using UAVs for imaging is usually considered a cheaper alternative to manned missions (and in some cases, even to satellite imagery), there are many factors that need to be considered. In order to be properly monitored, large properties may require the use of a more sophisticated aircraft, which can be many times more expensive than low-end UAVs. Because the risk of crashes is still high, insurance costs may also be significant. Ultimately, costs and benefits will strongly depend on the characteristics of the properties and on the intended uses, so a careful economic and technical analysis is recommended before deciding whether UAVs are advantageous.
Lightweight UAVs tend to be cheaper and easier to use, but they also have a limited payload capacity. This constrains not only the sensors that can be deployed but also the size of the batteries, thus limiting flying time and, as a consequence, area coverage [10]. There are a few solutions for this problem, such as flying higher (at the cost of a coarser GSD), using larger UAVs (which are more expensive and harder to operate), flying in formation (an expensive and difficult-to-implement option), and exploiting solar energy (there are practical and technical issues that still need to be overcome). Fortunately, as technology evolves, some of the problems related to the payload capacity are minimized and more suitable solutions arise.
In order to save battery resources, to decrease the time needed to cover the areas of interest [118], and to expedite the mosaicking process, images should be captured from the highest possible altitudes with a minimal overlap between them. The minimum level of overlapping will depend directly on the characteristics of the terrain, on the robustness of the mosaicking algorithm, and on how critical it is to avoid areas with missing data. The ideal height is the maximum altitude above which the sensors of choice no longer deliver enough resolution for a robust identification of the objects of interest, as long as legal limits are observed. Although the best possible setup can only be attained by carefully studying the specific characteristics of each survey [40], it should be possible to derive some general guidelines that are a reasonably good fit in most cases.
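As a starting point for such guidelines, the standard photogrammetric relation between flying altitude and GSD can be computed as sketched below; the camera parameters in the example are illustrative rather than tied to any particular sensor.

```python
# Minimal sketch: the standard photogrammetric GSD calculation used when
# choosing a flight altitude. Example parameters are illustrative only.
def ground_sample_distance_cm(altitude_m: float, focal_length_mm: float,
                              sensor_width_mm: float, image_width_px: int) -> float:
    """GSD (cm/pixel) = (sensor width * altitude) / (focal length * image width)."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

# Example: a 13.2 mm wide sensor, 8.8 mm lens, 5472 px images, flown at 100 m
print(ground_sample_distance_cm(100.0, 8.8, 13.2, 5472))  # ~2.7 cm/pixel
```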
Payload constraints mean that sensors should be as miniaturized as possible. However, in the specific case of imaging sensors, usually there is a tradeoff between miniaturization and data quality. There are some miniaturized sensors that can deliver an optical quality similar to their larger counterparts, but they tend to be considerably more expensive [169]. This has to be taken into account for stress monitoring because, depending on the GSD, optical distortions may render stress identification unfeasible.
Although the rules that regulate the use of UAVs are evolving towards a better balance between safety and practical usability, they are, in general, strict enough to cause problems in many situations. Regulations vary from country to country, but some rules are common to most of them, including the need for a pilot license and a special exemption depending on the size of the aircraft, the need to keep the aircraft within a visual line of sight, and the limits for the maximum flying altitude (typically 120 m) and speed, among others. The rules tend to be less stringent in sparsely populated areas, but they still need to be considered when employing UAVs for stress monitoring.

3.2. Specific Issues

The type of sensor plays an important role in the ability of UAV-based monitoring to detect the stresses of interest [125]. Conventional RGB cameras can deliver very high spatial resolutions even with UAVs flying high, and they tend to be much more affordable than other types of sensors, but their spectral information is very limited. Multispectral cameras can deliver high spatial resolutions and detailed reflectance information for a few bands with relatively wide bandwidths but tend to be considerably more expensive than RGB cameras. Typical spatial resolutions for hyperspectral sensors are much lower than those delivered by RGB and multispectral cameras, but they offer a very high resolution spectral profile containing reflectance information for hundreds of narrow spectral bands. This allows for the detection of finely localized spectral alterations caused by the stresses [160], but the relatively low spatial resolution may cause problems due to mixed pixels. This type of sensor also tends to be the least affordable. Finally, thermal sensors aim to detect the minute temperature alterations caused by certain stresses, with spatial resolutions that tend to be on par with those delivered by a hyperspectral sensor. Choosing the right sensor for a given application is a complex task that should always take into account the tradeoff between the potential gain in accuracy and the cost associated with more sophisticated sensors. The information contained in the literature can be very helpful towards this goal, but a definite answer can only be achieved by considering all particularities of the desired application.
Most studies on stress monitoring and detection carried out so far address the issue as a binary problem (the presence or absence of the stress of interest). The experiments are usually done using a control group (healthy plants) and a group for which the stress of interest is induced. The problem with this approach is that there are several other types of stress that can occur in a real environment, all capable of inducing physiological and spectral alterations that can either mimic those produced by the stress of interest or that can drastically alter the observable effects when occurring simultaneously with the targeted stress [154]. Although considering all possible variability in studies with a limited budget and severe time constraints is often unfeasible, it is important to consider at least the most probable combinations of factors in order to increase the practical applicability of the methods being proposed.
As discussed previously, the operation of UAVs can be expensive, especially if more sophisticated equipment is used. This led some authors to speculate that, in many situations, UAVs may not be cost effective when compared with alternative sources of data such as active proximal sensors [110]. On the other hand, UAVs are highly flexible for monitoring different parameters over the growing season, so more information can be obtained at a lower total cost [110].
The significance and reach of the results reported in a given study are directly related to the quality of the ground-truth data used as a reference to evaluate the proposed techniques and methods. However, the acquisition of high-quality reference data can be challenging. The collection of reference data often requires a trip to the experimental field for the measurements to be performed in loco. The sites selected for data sampling are often difficult to access, and proper data collection may become unfeasible under poor weather conditions. Additionally, the reference data needs to properly align with the test data and to cover all variability expected to occur within the parameters established for the experiments [116]. For example, if the objective is to monitor a certain disease, the reference data must consider the whole range of possible symptoms and severities that can be found in practice. Given the variety of conditions and interactions between different factors that can occur in practice, it is very difficult to be sure that a given reference set is representative enough. However, an effort should always be made to include as much diversity as possible in order to guarantee that the experimental results are a good representation of reality.
Some experiments based on digital images do not require the collection of reference data in the field, relying instead on the manual identification of the features of interest in the images. This image labelling process, being a subjective task, can be very error-prone [14], especially if the features of interest tend to blend with the rest of the image. The subjectivity involved in the process is mostly unavoidable, but labeling errors can be reduced by employing multiple human raters and a majority vote system.
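A minimal sketch of such a majority vote over multiple raters' labels (the rater labels are illustrative):

```python
# Minimal sketch: majority vote over labels assigned by multiple human raters.
from collections import Counter

def majority_label(ratings: list[str]) -> str:
    """Return the most common label among the raters."""
    return Counter(ratings).most_common(1)[0][0]

print(majority_label(["diseased", "healthy", "diseased"]))  # -> "diseased"
```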

4. Conclusions

As technology evolves, different methods to solve the problems that affect the agricultural activity are constantly being proposed. One aspect that is common to all those methods is the need for reliable data sources to work properly. Proximal, UAV-borne, and satellite-borne imaging sensors are all being extensively tested for a wide range of applications, and the results seem to indicate that they are complementary rather than competing for the same space. It is very difficult to indicate which would be the best approach without knowing the details of the desired application because there are too many factors that can influence the decision (costs, spatial resolution, area coverage, type of image to be used, type of feature or index to be calculated, etc.). It is interesting to note that, even when the same type of sensor is used, the information that can be acquired from each approach can be very different, which has led some authors to suggest that combining the information contained in images acquired at different scales could greatly improve the stress detection and monitoring capabilities [16].
As encouraging as the experimental results have been, a practical adoption of image-based stress monitoring has been slow, regardless of the type of sensor and deployment method. One possible reason for this is that there are so many factors capable of altering the physiological and morphological characteristics of the plants under practical conditions that unambiguous answers based solely on the information contained in the images become unfeasible with the current technology. This seems to indicate that current methods for stress monitoring have limited value in isolation but could be very useful as part of larger knowledge-based structures adopting a systemic view capable of combining different types of data, including weather information, historic data about disease incidence, irrigation practices, pesticide applications, etc. In any case, sensor and UAV technologies will continue to evolve, and more powerful machine learning techniques will continue to be proposed. At the same time, new experiments will lead to a better understanding about plant physiology and how different stresses affect biological processes. Also, with a growing number of research groups making their experimental data freely available, it will be possible to test new methods more rigorously, which will lead to a better comprehension of the actual capabilities and limitations of each type of sensor and the method of deployment. With plenty of scientific questions still unanswered, these advancements represent an excellent opportunity for researchers to explore.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Zhang, C.; Kovacs, J.M. The Application of Small Unmanned Aerial Systems for Precision Agriculture: A Review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  2. Hunt, E.R., Jr.; Daughtry, C.S.T.; Mirsky, S.B.; Hively, W.D. Remote Sensing with Simulated Unmanned Aircraft Imagery for Precision Agriculture Applications. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 4566–4571. [Google Scholar] [CrossRef]
  3. Beloev, I.H. A Review on Current and Emerging Application Possibilities for Unmanned Aerial Vehicles. Acta Technol. Agric. 2016, 19, 70–76. [Google Scholar] [CrossRef]
  4. Barbedo, J.G.A.; Koenigkan, L.V. Perspectives on The Use of Unmanned Aerial Systems to Monitor Cattle. Outlook Agric. 2018, 47, 214–222. [Google Scholar] [CrossRef]
  5. Hogan, S.D.; Kelly, M.; Stark, B.; Chen, Y. Unmanned Aerial Systems for Agriculture and Natural Resources. Calif. Agric. 2017, 71, 5–14. [Google Scholar] [CrossRef]
  6. Mulero-Pázmány, M.; Stolper, R.; Essen, L.; Negro, J.J.; Sassen, T. Remotely Piloted Aircraft Systems as A Rhinoceros Anti-Poaching Tool In Africa. PLoS ONE 2014, 9, e83873. [Google Scholar] [CrossRef] [PubMed]
  7. Miller, J.O.; Adkins, J.; Tully, K. Providing Aerial Images Through UAVs, 2017. Fact Sheet FS-1056. Available online: https://drum.lib.umd.edu/handle/1903/19168 (accessed on 20 April 2019).
  8. Freeman, P.K.; Freeland, R.S. Agricultural UAVs In The U.S.: Potential, Policy, and Hype. Remote Sens. Appl. Soc. Environ. 2015, 2, 35–43. [Google Scholar] [CrossRef]
  9. Gabriel, J.L.; Zarco-Tejada, P.J.; López-Herrera, P.J.; Pérez-Martín, E.; Alonso-Ayuso, M.; Quemada, M. Airborne and Ground Level Sensors for Monitoring Nitrogen Status in A Maize Crop. Biosyst. Eng. 2017, 160, 124–133. [Google Scholar] [CrossRef]
  10. Anderson, K.; Gaston, K.J. Lightweight Unmanned Aerial Vehicles Will Revolutionize Spatial Ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef]
  11. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. In Proceedings of the Annual Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–6 December 2012; pp. 1106–1114. [Google Scholar]
  12. Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning, 1st ed.; MIT Press: Cambridge, MA, USA, 2016. [Google Scholar]
  13. Matese, A.; Di Gennaro, S.F. Practical Applications of A Multisensor UAV Platform Based on Multispectral, Thermal and RGB High Resolution Images in Precision Viticulture. Agriculture 2018, 8, 116. [Google Scholar] [CrossRef]
  14. Sandino, J.; Pegg, G.; Gonzalez, F.; Smith, G. Aerial Mapping of Forests Affected By Pathogens Using UAVs, Hyperspectral Sensors, and Artificial Intelligence. Sensors 2018, 18, 944. [Google Scholar] [CrossRef] [PubMed]
  15. Zheng, H.; Li, W.; Jiang, J.; Liu, Y.; Cheng, T.; Tian, Y.; Zhu, Y.; Cao, W.; Zhang, Y.; Yao, X. A Comparative Assessment of Different Modeling Algorithms for Estimating Leaf Nitrogen Content in Winter Wheat Using Multispectral Images from an Unmanned Aerial Vehicle. Remote Sens. 2018, 10, 2026. [Google Scholar] [CrossRef]
  16. Otsu, K.; Pla, M.; Vayreda, J.; Brotons, L. Calibrating The Severity of Forest Defoliation By Pine Processionary Moth with Landsat and UAV Imagery. Sensors 2018, 18, 3278. [Google Scholar] [CrossRef] [PubMed]
  17. Shi, Y.; Thomasson, J.A.; Murray, S.C.; Pugh, N.A.; Rooney, W.L.; Shafian, S.; Rajan, N.; Rouze, G.; Morgan, C.L.S.; Neely, H.L.; et al. Unmanned Aerial Vehicles for High-Throughput Phenotyping and Agronomic Research. PLoS ONE 2016, 11, e0159781. [Google Scholar] [CrossRef] [PubMed]
  18. Zaman-Allah, M.; Vergara, O.; Araus, J.L.; Tarekegne, A.; Magorokosho, C.; Zarco-Tejada, P.J.; Hornero, A.; Albà, A.H.; Das, B.; Craufurd, P.; et al. Unmanned Aerial Platform-Based Multi-Spectral Imaging for Field Phenotyping of Maize. Plant Methods 2015, 11, 35. [Google Scholar] [CrossRef]
  19. Yu, N.; Li, L.; Schmitz, N.; Tian, L.F.; Greenberg, J.A.; Diers, B.W. Development of Methods to Improve Soybean Yield Estimation and Predict Plant Maturity with An Unmanned Aerial Vehicle Based Platform. Remote Sens. Environ. 2016, 187, 91–101. [Google Scholar] [CrossRef]
  20. Tattaris, M.; Reynolds, M.P.; Chapman, S.C. A Direct Comparison of Remote Sensing Approaches for High-Throughput Phenotyping In Plant Breeding. Front. Plant Sci. 2016, 7, 1131. [Google Scholar] [CrossRef] [PubMed]
  21. Machovina, B.L.; Feeley, K.J.; Machovina, B.J. UAV Remote Sensing of Spatial Variation In Banana Production. Crop Pasture Sci. 2016, 67, 1281–1287. [Google Scholar] [CrossRef]
  22. Zhou, X.; Zheng, H.; Xu, X.; He, J.; Ge, X.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.; Tian, Y. Predicting Grain Yield In Rice Using Multi-temporal Vegetation Indices From UAV-based Multispectral and Digital Imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255. [Google Scholar] [CrossRef]
  23. Kyratzis, A.C.; Skarlatos, D.P.; Menexes, G.C.; Vamvakousis, V.F.; Katsiotis, A. Assessment of Vegetation Indices Derived By UAV Imagery for Durum Wheat Phenotyping Under A Water Limited and Heat Stressed Mediterranean Environment. Front. Plant Sci. 2017, 8, 1114. [Google Scholar] [CrossRef]
  24. Kefauver, S.C.; Vicente, R.; Vergara-Díaz, O.; Fernandez-Gallego, J.A.; Kerfal, S.; Lopez, A.; Melichar, J.P.E.; Serret Molins, M.D.; Araus, J.L. Comparative UAV and Field Phenotyping to Assess Yield and Nitrogen Use Efficiency In Hybrid and Conventional Barley. Front. Plant Sci. 2017, 8, 1733. [Google Scholar] [CrossRef] [PubMed]
  25. Possoch, M.; Bieker, S.; Hoffmeister, D.; Bolten, A.; Schellberg, J.; Bareth, G. Multi-Temporal Crop Surface Models Combined with The RGB Vegetation Index From UAV-Based Images for Forage Monitoring In Grassland. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B1, 991–998. [Google Scholar] [CrossRef]
  26. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (UAS)-based Phenotyping of Soybean Using Multi-sensor Data Fusion and Extreme Learning Machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
  27. Ballesteros, R.; Ortega, J.F.; Hernandez, D.; Moreno, M.A. Onion Biomass Monitoring Using UAV-based RGB Imaging. Precis. Agric. 2018, 19, 840–857. [Google Scholar] [CrossRef]
  28. Cruzan, M.B.; Weinstein, B.G.; Grasty, M.R.; Kohrn, B.F.; Hendrickson, E.C.; Arredondo, T.M.; Thompson, P.G. Small Unmanned Aerial Vehicles (Micro-UAVs, Drones) In Plant Ecology. Appl. Plant Sci. 2016, 4. [Google Scholar] [CrossRef] [PubMed]
  29. Baena, S.; Boyd, D.S.; Moat, J. UAVs In Pursuit of Plant Conservation—Real World Experiences. Ecol. Inform. 2018, 47, 2–9. [Google Scholar] [CrossRef]
  30. Ishida, T.; Kurihara, J.; Viray, F.A.; Namuco, S.B.; Paringit, E.C.; Perez, G.J.; Takahashi, Y.; Marciano, J.J. A Novel Approach for Vegetation Classification Using UAV-based Hyperspectral Imaging. Comput. Electron. Agric. 2018, 144, 80–85. [Google Scholar] [CrossRef]
  31. Sankey, T.; McVay, J.; Swetnam, T.; McClaran, M.; Heilman, P.; Nichols, M. UAV Hyperspectral and Lidar Data and Their Fusion for Arid and Semi-arid Land Vegetation Monitoring. Remote Sens. Ecol. Conserv. 2018, 4, 20–33. [Google Scholar] [CrossRef]
  32. Chianucci, F.; Disperati, L.; Guzzi, D.; Bianchini, D.; Nardino, V.; Lastri, C.; Rindinella, A.; Corona, P. Estimation of Canopy Attributes In Beech Forests Using True Colour Digital Images From A Small Fixed-wing UAV. Int. J. Appl. Earth Obs. Geoinf. 2016, 47, 60–68. [Google Scholar] [CrossRef]
  33. Duan, T.; Zheng, B.; Guo, W.; Ninomiya, S.; Guo, Y.; Chapman, S.C. Comparison of Ground Cover Estimates From Experiment Plots in Cotton, Sorghum and Sugarcane Based on Images and Ortho-mosaics Captured by UAV. Funct. Plant Biol. 2016, 44, 169–183. [Google Scholar] [CrossRef]
  34. Sankey, T.; Donager, J.; McVay, J.; Sankey, J.B. UAV Lidar and Hyperspectral Fusion for Forest Monitoring in the Southwestern USA. Remote Sens. Environ. 2017, 195, 30–43. [Google Scholar] [CrossRef]
  35. Gennaro, S.F.D.; Rizza, F.; Badeck, F.W.; Berton, A.; Delbono, S.; Gioli, B.; Toscano, P.; Zaldei, A.; Matese, A. UAV-based High-throughput Phenotyping to Discriminate Barley Vigour with Visible and Near-infrared Vegetation Indices. Int. J. Remote Sens. 2018, 39, 35–43. [Google Scholar] [CrossRef]
  36. Han, L.; Yang, G.; Yang, H.; Xu, B.; Li, Z.; Yang, X. Clustering Field-Based Maize Phenotyping of Plant-Height Growth and Canopy Spectral Dynamics Using A UAV Remote-Sensing Approach. Front. Plant Sci. 2018, 9, 1638. [Google Scholar] [CrossRef]
  37. Holman, F.H.; Riche, A.B.; Michalski, A.; Castle, M.; Wooster, M.J.; Hawkesford, M.J. High Throughput Field Phenotyping of Wheat Plant Height and Growth Rate In Field Plot Trials Using UAV Based Remote Sensing. Remote Sens. 2016, 8, 1031. [Google Scholar] [CrossRef]
  38. Wang, X.; Singh, D.; Marla, S.; Morris, G.; Poland, J. Field-based High-throughput Phenotyping of Plant Height In Sorghum Using Different Sensing Technologies. Plant Methods 2018, 14, 53. [Google Scholar] [CrossRef] [PubMed]
  39. Chu, T.; Starek, M.J.; Brewer, M.J.; Murray, S.C.; Pruter, L.S. Assessing Lodging Severity over an Experimental Maize (Zea mays L.) Field Using UAS Images. Remote Sens. 2017, 9, 923. [Google Scholar] [CrossRef]
  40. Liu, T.; Li, R.; Zhong, X.; Jiang, M.; Jin, X.; Zhou, P.; Liu, S.; Sun, C.; Guo, W. Estimates of Rice Lodging Using Indices Derived From UAV Visible and Thermal Infrared Images. Agric. For. Meteorol. 2018, 252, 144–154. [Google Scholar] [CrossRef]
  41. Zarco-Tejada, P.; Guillén-Climent, M.; Hernández-Clemente, R.; Catalina, A.; González, M.; Martín, P. Estimating Leaf Carotenoid Content in Vineyards Using High Resolution Hyperspectral Imagery Acquired from an Unmanned Aerial Vehicle (UAV). Agric. For. Meteorol. 2013, 171–172, 281–294. [Google Scholar] [CrossRef]
  42. Li, W.; Niu, Z.; Chen, H.; Li, D. Characterizing Canopy Structural Complexity for The Estimation of Maize LAI Based on ALS Data and UAV Stereo Images. Int. J. Remote Sens. 2017, 38, 2106–2116. [Google Scholar] [CrossRef]
  43. Karydas, C.; Gewehr, S.; Iatrou, M.; Iatrou, G.; Mourelatos, S. Olive Plantation Mapping on A Sub-Tree Scale with Object-Based Image Analysis of Multispectral UAV Data; Operational Potential in Tree Stress Monitoring. J. Imaging 2017, 3, 57. [Google Scholar] [CrossRef]
  44. Chen, R.; Chu, T.; Landivar, J.A.; Yang, C.; Maeda, M.M. Monitoring Cotton (Gossypium hirsutum L.) Germination Using Ultrahigh-resolution UAS Images. Precis. Agric. 2018, 19, 161–177. [Google Scholar] [CrossRef]
  45. Johansen, K.; Raharjo, T.; McCabe, M.F. Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects. Remote Sens. 2018, 10, 854. [Google Scholar] [CrossRef]
  46. Maes, W.H.; Huete, A.R.; Avino, M.; Boer, M.M.; Dehaan, R.; Pendall, E.; Griebel, A.; Steppe, K. Can UAV-Based Infrared Thermography Be Used to Study Plant-Parasite Interactions Between Mistletoe and Eucalypt Trees? Remote Sens. 2018, 10, 2062. [Google Scholar] [CrossRef]
  47. Milas, A.S.; Romanko, M.; Reil, P.; Abeysinghe, T.; Marambe, A. The Importance of Leaf Area Index In Mapping Chlorophyll Content of Corn Under Different Agricultural Treatments Using UAV Images. Int. J. Remote Sens. 2018, 39, 5415–5431. [Google Scholar] [CrossRef]
  48. Padua, L.; Marques, P.; Hruska, J.; Adao, T.; Bessa, J.; Sousa, A.; Peres, E.; Morais, R.; Sousa, J.J. Vineyard Properties Extraction Combining UAS-based RGB Imagery with Elevation Data. Int. J. Remote Sens. 2018, 39, 5377–5401. [Google Scholar] [CrossRef]
  49. Oliveira, H.C.; Guizilini, V.C.; Nunes, I.P.; Souza, J.R. Failure Detection in Row Crops From UAV Images Using Morphological Operators. IEEE Geosci. Remote Sens. Lett. 2018, 15, 991–995. [Google Scholar] [CrossRef]
  50. Franceschini, M.H.D.; Bartholomeus, H.M.; Apeldoorn, D.V.; Suomalainen, J.; Kooistra, L. Assessing Changes in Potato Canopy Caused By Late Blight In Organic Production Systems Through UAV-based Pushbroom Imaging Spectrometer. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 109. [Google Scholar] [CrossRef]
  51. Wallace, L.; Lucieer, A.; Watson, C.; Turner, D. Development of A UAV-LiDAR System with Application to Forest Inventory. Remote Sens. 2012, 4, 1519–1543. [Google Scholar] [CrossRef]
  52. Johnson, R.; Smith, K.; Wescott, K. Introduction: Unmanned Aircraft System (UAS) Applications to Land and Natural Resource Management. Environ. Pract. 2015, 17, 170–177. [Google Scholar] [CrossRef]
  53. Mahlein, A.K. Plant Disease Detection By Imaging Sensors—Parallels and Specific Demands for Precision Agriculture and Plant Phenotyping. Plant Dis. 2016, 100, 241–251. [Google Scholar] [CrossRef] [PubMed]
  54. Padua, L.; Vanko, J.; Hruska, J.; Adao, T.; Sousa, J.J.; Peres, E.; Morais, R. UAS, Sensors, and Data Processing in Agroforestry: A Review towards Practical Applications. Int. J. Remote Sens. 2017, 38, 2349–2391. [Google Scholar] [CrossRef]
  55. Adão, T.; Hruska, J.; Padua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef]
  56. Saari, H.; Polonen, I.; Salo, H.; Honkavaara, E.; Hakala, T.; Holmlund, C.; Mäkynen, J.; Mannila, R.; Antila, T.; Akujarvi, A. Miniaturized Hyperspectral Imager Calibration and UAV Flight Campaigns. In Proceedings of the Sensors, Systems, and Next-Generation Satellites XVII, Dresden, Germany, 23–26 September 2013; Volume 8889. [Google Scholar] [CrossRef]
  57. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D Hyperspectral Information with Lightweight UAV Snapshot Cameras for Vegetation Monitoring: From Camera Calibration to Quality Assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  58. Gómez-Candón, D.; Torres-Sanchez, J.; Labbé, S.; Jolivot, A.; Martinez, S.; Regnard, J. Water Stress Assessment at Tree Scale: High-resolution Thermal UAV Imagery Acquisition and Processing. Acta Hortic. 2017, 1150, 159–166. [Google Scholar]
  59. Ribeiro-Gomes, K.; Hernandez-Lopez, D.; Ortega, J.F.; Ballesteros, R.; Poblete, T.; Moreno, M.A. Uncooled Thermal Camera Calibration and Optimization of The Photogrammetry Process for UAV Applications in Agriculture. Sensors 2017, 17, 2173. [Google Scholar] [CrossRef] [PubMed]
  60. Maes, W.H.; Huete, A.R.; Steppe, K. Optimizing The Processing of UAV-Based Thermal Imagery. Remote Sens. 2017, 9, 476. [Google Scholar] [CrossRef]
  61. Gonzalez-Dugo, V.; Zarco-Tejada, P.; Fereres, E. Applicability and Limitations of Using the Crop Water Stress Index as An Indicator of Water Deficits In Citrus Orchards. Agric. For. Meteorol. 2014, 198–199, 94–104. [Google Scholar] [CrossRef]
  62. Balota, M.; Oakes, J. UAV Remote Sensing for Phenotyping Drought Tolerance In Peanuts. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II; Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series; International Society for Optics and Photonics: Washington, DC, USA, 2017; Volume 10218, p. 102180C. [Google Scholar] [CrossRef]
  63. Bellvert, J.; Zarco-Tejada, P.J.; Girona, J.; Fereres, E. Mapping Crop Water Stress Index In A ‘Pinot-noir’ Vineyard: Comparing Ground Measurements with Thermal Remote Sensing Imagery From An Unmanned Aerial Vehicle. Precis. Agric. 2014, 15, 361–376. [Google Scholar] [CrossRef]
  64. Berni, J.A.J.; Zarco-Tejada, P.J.; Suarez, L.; Fereres, E. Thermal and Narrowband Multispectral Remote Sensing for Vegetation Monitoring From An Unmanned Aerial Vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef]
  65. Calderón, R.; Navas-Cortés, J.; Lucena, C.; Zarco-Tejada, P. High-resolution Airborne Hyperspectral and Thermal Imagery for Early Detection of Verticillium Wilt of Olive Using Fluorescence, Temperature and Narrow-band Spectral Indices. Remote Sens. Environ. 2013, 139, 231–245. [Google Scholar] [CrossRef]
  66. Ciezkowski, W.; Jozwiak, J.; Szporak-Wasilewska, S.; Dabrowski, P.; Kleniewska, M.; Goraj, M.; Chormanski, J. Water Stress Index for Bogs and Mires Based on UAV Land Surface Measurements and Its Dependency on Airborne Hyperspectral Data. In Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 23–27 July 2018; pp. 9257–9260. [Google Scholar]
  67. Delalieux, S.; Zarco-Tejada, P.J.; Tits, L.; Bello, M.Á.J.; Intrigliolo, D.S.; Somers, B. Unmixing-Based Fusion of Hyperspatial and Hyperspectral Airborne Imagery for Early Detection of Vegetation Stress. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2571–2582. [Google Scholar] [CrossRef]
  68. Espinoza, C.Z.; Khot, L.R.; Sankaran, S.; Jacoby, P.W. High Resolution Multispectral and Thermal Remote Sensing-Based Water Stress Assessment In Subsurface Irrigated Grapevines. Remote Sens. 2017, 9, 961. [Google Scholar] [CrossRef]
  69. Gonzalez-Dugo, V.; Zarco-Tejada, P.; Nicolás, E.; Nortes, P.A.; Alarcón, J.J.; Intrigliolo, D.S.; Fereres, E. Using High Resolution UAV Thermal Imagery to Assess The Variability In The Water Status of Five Fruit Tree Species within A Commercial Orchard. Precis. Agric. 2013, 14, 660–678. [Google Scholar] [CrossRef]
  70. Hoffmann, H.; Jensen, R.; Thomsen, A.; Nieto, H.; Rasmussen, J.; Friborg, T. Crop Water Stress Maps for An Entire Growing Season From Visible and Thermal UAV Imagery. Biogeosciences 2016, 13, 6545–6563. [Google Scholar] [CrossRef]
  71. Katsigiannis, P.; Misopolinos, L.; Liakopoulos, V.; Alexandridis, T.K.; Zalidis, G. An Autonomous Multi-sensor UAV System for Reduced-input Precision Agriculture Applications. In Proceedings of the 2016 24th Mediterranean Conference on Control and Automation (MED), Athens, Greece, 21–24 June 2016; pp. 60–64. [Google Scholar] [CrossRef]
  72. Ludovisi, R.; Tauro, F.; Salvati, R.; Khoury, S.; Mugnozza Scarascia, G.; Harfouche, A. UAV-Based Thermal Imaging for High-Throughput Field Phenotyping of Black Poplar Response to Drought. Front. Plant Sci. 2017, 8, 1681. [Google Scholar] [CrossRef]
  73. Park, S.; Nolan, A.; Ryu, D.; Fuentes, S.; Hernandez, E.; Chung, H.; O’Connell, M. Estimation of Crop Water Stress in A Nectarine Orchard Using High-resolution Imagery from Unmanned Aerial Vehicle (UAV). In Proceedings of the MODSIM2015—21st International Congress on Modelling and Simulation, Gold Coast, Australia, 29 November–4 December 2015; Weber, T., McPhee, M., Anderssen, R., Eds.; Modelling and Simulation Society of Australia and New Zealand: Canberra, Australia, 2015; pp. 1413–1419. [Google Scholar]
  74. Park, S.; Ryu, D.; Fuentes, S.; Chung, H.; Hernández-Montes, E.; O’Connell, M. Adaptive Estimation of Crop Water Stress In Nectarine and Peach Orchards Using High-Resolution Imagery From An Unmanned Aerial Vehicle (UAV). Remote Sens. 2017, 9, 828. [Google Scholar] [CrossRef]
  75. Poblete-Echeverría, C.; Sepulveda-Reyes, D.; Ortega-Farias, S.; Zuñiga, M.; Fuentes, S. Plant Water Stress Detection Based on Aerial and Terrestrial Infrared Thermography: A Study Case From Vineyard and Olive Orchard. Acta Hortic. 2016, 1112, 141–146. [Google Scholar] [CrossRef]
  76. Poblete, T.; Ortega-Farias, S.; Moreno, M.A.; Bardeen, M. Artificial Neural Network to Predict Vine Water Status Spatial Variability Using Multispectral Information Obtained from An Unmanned Aerial Vehicle (UAV). Sensors 2017, 17, 2488. [Google Scholar] [CrossRef] [PubMed]
  77. Poblete, T.; Ortega-Farias, S.; Ryu, D. Automatic Coregistration Algorithm to Remove Canopy Shaded Pixels in UAV-Borne Thermal Images to Improve The Estimation of Crop Water Stress Index of A Drip-Irrigated Cabernet Sauvignon Vineyard. Sensors 2018, 18, 397. [Google Scholar] [CrossRef]
  78. Romero, M.; Luo, Y.; Su, B.; Fuentes, S. Vineyard Water Status Estimation Using Multispectral Imagery from an UAV Platform and Machine Learning Algorithms for Irrigation Scheduling Management. Comput. Electron. Agric. 2018, 147, 109–117. [Google Scholar] [CrossRef]
  79. Santesteban, L.; Gennaro, S.D.; Herrero-Langreo, A.; Miranda, C.; Royo, J.; Matese, A. High-resolution UAV-based Thermal Imaging to Estimate The Instantaneous and Seasonal Variability of Plant Water Status within A Vineyard. Agric. Water Manag. 2017, 183, 49–59. [Google Scholar] [CrossRef]
  80. Soubry, I.; Patias, P.; Tsioukas, V. Monitoring Vineyards with UAV and Multi-sensors for the Assessment of Water Stress and Grape Maturity. J. Unmanned Veh. Syst. 2017, 5, 37–50. [Google Scholar] [CrossRef]
  81. Stagakis, S.; Gonzalez-Dugo, V.; Cid, P.; Guillen-Climent, M.; Zarco-Tejada, P. Monitoring Water Stress and Fruit Quality In An Orange Orchard Under Regulated Deficit Irrigation Using Narrow-band Structural and Physiological Remote Sensing Indices. ISPRS J. Photogramm. Remote Sens. 2012, 71, 47–61. [Google Scholar] [CrossRef]
  82. Suarez, L.; Zarco-Tejada, P.; Gonzalez-Dugo, V.; Berni, J.; Sagardoy, R.; Morales, F.; Fereres, E. Detecting Water Stress Effects on Fruit Quality In Orchards with Time-series PRI Airborne Imagery. Remote Sens. Environ. 2010, 114, 286–298. [Google Scholar] [CrossRef]
  83. Sullivan, D.G.; Fulton, J.P.; Shaw, J.N.; Bland, G. Evaluating The Sensitivity of An Unmanned Thermal Infrared Aerial System to Detect Water Stress In A Cotton Canopy. Proc. ASABE 2007, 50, 1955–1962. [Google Scholar] [CrossRef]
  84. Zarco-Tejada, P.; Berni, J.; Suarez, L.; Sepulcre-Canto, G.; Morales, F.; Miller, J. Imaging Chlorophyll Fluorescence with An Airborne Narrow-band Multispectral Camera for Vegetation Stress Detection. Remote Sens. Environ. 2009, 113, 1262–1275. [Google Scholar] [CrossRef]
  85. Zarco-Tejada, P.; Gonzalez-Dugo, V.; Berni, J. Fluorescence, Temperature and Narrow-band Indices Acquired From A UAV Platform for Water Stress Detection Using A Micro-hyperspectral Imager and A Thermal Camera. Remote Sens. Environ. 2012, 117, 322–337. [Google Scholar] [CrossRef]
  86. Zarco-Tejada, P.; González-Dugo, V.; Williams, L.; Suárez, L.; Berni, J.; Goldhamer, D.; Fereres, E. A PRI-based Water Stress Index Combining Structural and Chlorophyll Effects: Assessment Using Diurnal Narrow-band Airborne Imagery and The CWSI Thermal Index. Remote Sens. Environ. 2013, 138, 38–50. [Google Scholar] [CrossRef]
  87. Zhao, T.; Stark, B.; Chen, Y.; Ray, A.L.; Doll, D. A Detailed Field Study of Direct Correlations Between Ground Truth Crop Water Stress and Normalized Difference Vegetation Index (NDVI) From Small Unmanned Aerial System (sUAS). In Proceedings of the 2015 International Conference on Unmanned Aircraft Systems (ICUAS), Denver, CO, USA, 9–12 June 2015; pp. 520–525. [Google Scholar] [CrossRef]
  88. Zhao, T.; Stark, B.; Chen, Y.; Ray, A.L.; Doll, D. Challenges In Water Stress Quantification Using Small Unmanned Aerial System (sUAS): Lessons From A Growing Season of Almond. J. Intell. Robot. Syst. 2017, 88, 721–735. [Google Scholar] [CrossRef]
  89. Zhao, T.; Doll, D.; Wang, D.; Chen, Y. A New Framework for UAV-based Remote Sensing Data Processing and Its Application in Almond Water Stress Quantification. In Proceedings of the 2017 International Conference on Unmanned Aircraft Systems (ICUAS), Miami, FL, USA, 13–16 June 2017; pp. 1794–1799. [Google Scholar] [CrossRef]
  90. Graeff, S.; Pfenning, J.; Claupein, W.; Liebig, H.P. Evaluation of Image Analysis to Determine The N-Fertilizer Demand of Broccoli Plants (Brassica oleracea convar. botrytis var. italica). Adv. Opt. Technol. 2008, 2008, 359760. [Google Scholar]
  91. Dezordi, L.R.; Aquino, L.A.; Aquino, R.F.B.A.; Clemente, J.M.; Assunção, N. Diagnostic Methods to Assess the Nutritional Status of The Carrot Crop. Rev. Bras. De Ciência Do Solo 2016, 40, e0140813. [Google Scholar] [CrossRef]
  92. Balasubramaniam, P.; Ananthi, V.P. Segmentation of Nutrient Deficiency In Incomplete Crop Images Using Intuitionistic Fuzzy C-means Clustering Algorithm. Nonlinear Dyn. 2016, 83, 849–866. [Google Scholar] [CrossRef]
  93. Jia, L.; Chen, X.; Zhang, F.; Buerkert, A.; Römheld, V. Use of Digital Camera to Assess Nitrogen Status of Winter Wheat In The Northern China Plain. J. Plant Nutr. 2004, 27, 441–450. [Google Scholar] [CrossRef]
  94. Nauš, J.; Prokopová, J.; Řebíček, J.; Špundová, M. SPAD Chlorophyll Meter Reading Can Be Pronouncedly Affected By Chloroplast Movement. Photosynth. Res. 2010, 105, 265–271. [Google Scholar] [CrossRef]
  95. Ali, M.M.; Al-Ani, A.; Eamus, D.; Tan, D.K.Y. Leaf Nitrogen Determination Using Non-Destructive Techniques—A Review. J. Plant Nutr. 2017, 40, 928–953. [Google Scholar] [CrossRef]
  96. Meggio, F.; Zarco-Tejada, P.; Núñez, L.; Sepulcre-Cantó, G.; González, M.; Martín, P. Grape Quality Assessment In Vineyards Affected By Iron Deficiency Chlorosis Using Narrow-band Physiological Remote Sensing Indices. Remote Sens. Environ. 2010, 114, 1968–1986. [Google Scholar] [CrossRef]
  97. Sims, N.C.; Culvenor, D.; Newnham, G.; Coops, N.C.; Hopmans, P. Towards The Operational Use of Satellite Hyperspectral Image Data for Mapping Nutrient Status and Fertilizer Requirements In Australian Plantation Forests. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 6, 320–328. [Google Scholar] [CrossRef]
  98. Huang, S.; Miao, Y.; Zhao, G.; Yuan, F.; Ma, X.; Tan, C.; Yu, W.; Gnyp, M.L.; Lenz-Wiedemann, V.I.S.; Rascher, U.; et al. Satellite Remote Sensing-based In-season Diagnosis of Rice Nitrogen Status In Northeast China. Remote Sens. 2015, 7, 10646–10667. [Google Scholar] [CrossRef]
  99. Congalton, R.G.; Gu, J.; Yadav, K.; Thenkabail, P.; Ozdogan, M. Global Land Cover Mapping: A Review and Uncertainty Analysis. Remote Sens. 2014, 6, 12070–12093. [Google Scholar] [CrossRef]
  100. Agüera, F.; Carvajal, F.; Pérez, M. Measuring Sunflower Nitrogen Status From An Unmanned Aerial Vehicle-Based System and An On-the-Ground Device. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2011, 3822, 33–37. [Google Scholar] [CrossRef]
  101. Ballester, C.; Hornbuckle, J.; Brinkhoff, J.; Smith, J.; Quayle, W. Assessment of In-Season Cotton Nitrogen Status and Lint Yield Prediction From Unmanned Aerial System Imagery. Remote Sens. 2017, 9, 1149. [Google Scholar] [CrossRef]
  102. Benincasa, P.; Antognelli, S.; Brunetti, L.; Fabbri, C.A.; Natale, A.; Sartoretti, V.; Modeo, G.; Guiducci, M.; Tei, F.; Vizzari, M. Reliability of NDVI Derived By High Resolution Satellite and UAV Compared to In-field Methods for The Evaluation of Early Crop N Status and Grain Yield In Wheat. Exp. Agric. 2018, 54, 604–622. [Google Scholar] [CrossRef]
  103. Capolupo, A.; Kooistra, L.; Berendonk, C.; Boccia, L.; Suomalainen, J. Estimating Plant Traits of Grasslands From UAV-Acquired Hyperspectral Images: A Comparison of Statistical Approaches. ISPRS Int. J. Geo-Inf. 2015, 4, 2792–2820. [Google Scholar] [CrossRef]
  104. Caturegli, L.; Corniglia, M.; Gaetani, M.; Grossi, N.; Magni, S.; Migliazzi, M.; Angelini, L.; Mazzoncini, M.; Silvestri, N.; Fontanelli, M.; Raffaelli, M.; Peruzzi, A.; Volterrani, M. Unmanned Aerial Vehicle to Estimate Nitrogen Status of Turfgrasses. PLoS ONE 2016, 11, 1–13. [Google Scholar] [CrossRef]
  105. Corti, M.; Cavalli, D.; Cabassi, G.; Vigoni, A.; Degano, L.; Marino Gallina, P. Application of A Low-cost Camera on A UAV to Estimate Maize Nitrogen-related Variables. Precis. Agric. 2018. [Google Scholar] [CrossRef]
  106. Felderhof, L.; Gillieson, D. Near-infrared Imagery From Unmanned Aerial Systems and Satellites Can Be Used to Specify Fertilizer Application Rates In Tree Crops. Can. J. Remote Sens. 2011, 37, 376–386. [Google Scholar] [CrossRef]
  107. Geipel, J.; Link, J.; Wirwahn, J.A.; Claupein, W. A Programmable Aerial Multispectral Camera System for In-Season Crop Biomass and Nitrogen Content Estimation. Agriculture 2016, 6, 4. [Google Scholar] [CrossRef]
  108. Hunt, E.R., Jr.; Cavigelli, M.; Daughtry, C.S.T.; Mcmurtrey, J.E.; Walthall, C.L. Evaluation of Digital Photography From Model Aircraft for Remote Sensing of Crop Biomass and Nitrogen Status. Precis. Agric. 2005, 6, 359–378. [Google Scholar] [CrossRef]
  109. Hunt, E.R., Jr.; Rondon, S.I.; Hamm, P.B.; Turner, R.W.; Bruce, A.E.; Brungardt, J.J. Insect Detection and Nitrogen Management for Irrigated Potatoes Using Remote Sensing From Small Unmanned Aircraft Systems. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping; International Society for Optics and Photonics: Washington, DC, USA, 2016; Volume 9866. [Google Scholar] [CrossRef]
  110. Hunt, E.R.; Horneck, D.A.; Spinelli, C.B.; Turner, R.W.; Bruce, A.E.; Gadler, D.J.; Brungardt, J.J.; Hamm, P.B. Monitoring Nitrogen Status of Potatoes Using Small Unmanned Aerial Vehicles. Precis. Agric. 2018, 19, 314–333. [Google Scholar] [CrossRef]
  111. Jay, S.; Baret, F.; Dutartre, D.; Malatesta, G.; Héno, S.; Comar, A.; Weiss, M.; Maupas, F. Exploiting The Centimeter Resolution of UAV Multispectral Imagery to Improve Remote-sensing Estimates of Canopy Structure and Biochemistry In Sugar Beet Crops. Remote Sens. Environ. 2018. [Google Scholar] [CrossRef]
  112. Lelong, C.C.D.; Burger, P.; Jubelin, G.; Roux, B.; Baret, S.L.F. Assessment of Unmanned Aerial Vehicles Imagery for Quantitative Monitoring of Wheat Crop In Small Plots. Sensors 2008, 8, 3557–3585. [Google Scholar] [CrossRef]
  113. Li, J.; Zhang, F.; Qian, X.; Zhu, Y.; Shen, G. Quantification of Rice Canopy Nitrogen Balance Index with Digital Imagery From Unmanned Aerial Vehicle. Remote Sens. Lett. 2015, 6, 183–189. [Google Scholar] [CrossRef]
  114. Liu, H.; Zhu, H.; Wang, P. Quantitative Modelling for Leaf Nitrogen Content of Winter Wheat Using UAV-based Hyperspectral Data. Int. J. Remote Sens. 2017, 38, 2117–2134. [Google Scholar] [CrossRef]
  115. Liu, S.; Li, L.; Gao, W.; Zhang, Y.; Liu, Y.; Wang, S.; Lu, J. Diagnosis of Nitrogen Status In Winter Oilseed Rape (Brassica napus L.) Using In-situ Hyperspectral Data and Unmanned Aerial Vehicle (UAV) Multispectral Images. Comput. Electron. Agric. 2018, 151, 185–195. [Google Scholar] [CrossRef]
  116. Van Der Meij, B.; Kooistra, L.; Suomalainen, J.; Barel, J.M.; De Deyn, G.B. Remote Sensing of Plant Trait Responses to Field-based Plant–soil Feedback Using UAV-based Optical Sensors. Biogeosciences 2017, 14, 733–749. [Google Scholar] [CrossRef]
  117. Saberioon, M.M.; Gholizadeh, A. Novel Approach for Estimating Nitrogen Content In Paddy Fields Using Low Altitude Remote Sensing System. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B1, 1011–1015. [Google Scholar] [CrossRef]
  118. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K.H. Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery. Remote Sens. 2016, 8, 706. [Google Scholar] [CrossRef]
  119. Severtson, D.; Callow, N.; Flower, K.; Neuhaus, A.; Olejnik, M.; Nansen, C. Unmanned Aerial Vehicle Canopy Reflectance Data Detects Potassium Deficiency and Green Peach Aphid Susceptibility In Canola. Precis. Agric. 2016, 17, 659–677. [Google Scholar] [CrossRef]
  120. Swain, K.C.; Jayasuriya, H.P.W.; Salokhe, V.M. Suitability of Low-altitude Remote Sensing Images for Estimating Nitrogen Treatment Variations in Rice Cropping for Precision Agriculture Adoption. J. Appl. Remote Sens. 2007, 1, 1–11. [Google Scholar] [CrossRef]
  121. Vega, F.A.; Ramírez, F.C.; Saiz, M.P.; Rosúa, F.O. Multi-temporal Imaging Using An Unmanned Aerial Vehicle for Monitoring A Sunflower Crop. Biosyst. Eng. 2015, 132, 19–27. [Google Scholar] [CrossRef]
  122. Du, W.; Xu, T.; Yu, F.; Chen, C. Measurement of Nitrogen Content in Rice by Inversion of Hyperspectral Reflectance Data from an Unmanned Aerial Vehicle. Ciência Rural. 2018, 48. [Google Scholar]
  123. Yakushev, V.P.; Kanash, E.V. Evaluation of Wheat Nitrogen Status By Colorimetric Characteristics of Crop Canopy Presented In Digital Images. J. Agric. Inform. 2016, 7, 65–74. [Google Scholar]
  124. Zermas, D.; Teng, D.; Stanitsas, P.; Bazakos, M.; Kaiser, D.; Morellas, V.; Mulla, D.; Papanikolopoulos, N. Automation Solutions for the Evaluation of Plant Health in Corn Fields. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 6521–6527. [Google Scholar] [CrossRef]
  125. Zheng, H.; Cheng, T.; Li, D.; Zhou, X.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Evaluation of RGB, Color-Infrared and Multispectral Images Acquired From Unmanned Aerial Systems for The Estimation of Nitrogen Accumulation in Rice. Remote Sens. 2018, 10, 824. [Google Scholar] [CrossRef]
  126. Zhu, J.; Wang, K.; Deng, J.; Harmon, T. Quantifying Nitrogen Status of Rice Using Low Altitude UAV-Mounted System and Object-Oriented Segmentation Methodology. In Proceedings of the International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, San Diego, CA, USA, 30 August–2 September 2009; pp. 1–7. [Google Scholar]
  127. Zhu, H.; Liu, H.; Xu, Y.; Guijun, Y. UAV-based Hyperspectral Analysis and Spectral Indices Constructing for Quantitatively Monitoring Leaf Nitrogen Content of Winter Wheat. Appl. Opt. 2018, 57, 7722–7732. [Google Scholar] [CrossRef]
  128. Chang, S.X.; Robison, D.J. Nondestructive and Rapid Estimation of Hardwood Foliar Nitrogen Status Using the SPAD-502 Chlorophyll Meter. For. Ecol. Manag. 2003, 181, 331–338. [Google Scholar] [CrossRef]
  129. Bock, C.H.; Poole, G.H.; Parker, P.E.; Gottwald, T.R. Plant Disease Severity Estimated Visually, by Digital Photography and Image Analysis, and By Hyperspectral Imaging. Crit. Rev. Plant Sci. 2010, 29, 59–107. [Google Scholar] [CrossRef]
  130. Barbedo, J.G.A. Factors Influencing the Use of Deep Learning for Plant Disease Recognition. Biosyst. Eng. 2018, 172, 84–91. [Google Scholar] [CrossRef]
  131. Altas, Z.; Ozguven, M.M.; Yanar, Y. Determination of Sugar Beet Leaf Spot Disease Level (Cercospora beticola Sacc.) with Image Processing Technique By Using Drone. Curr. Investig. Agric. Curr. Res. 2018, 5, 621–631. [Google Scholar] [CrossRef]
  132. Dang, L.M.; Hassan, S.I.; Suhyeon, I.; Sangaiah, A.K.; Mehmood, I.; Rho, S.; Seo, S.; Moon, H. UAV Based Wilt Detection System Via Convolutional Neural Networks. Sustain. Comput. Inform. Syst. 2018. [Google Scholar] [CrossRef]
  133. Dash, J.P.; Watt, M.S.; Pearse, G.D.; Heaphy, M.; Dungey, H.S. Assessing Very High Resolution UAV Imagery for Monitoring Forest Health During A Simulated Disease Outbreak. ISPRS J. Photogramm. Remote Sens. 2017, 131, 1–14. [Google Scholar] [CrossRef]
  134. Dash, J.P.; Pearse, G.D.; Watt, M.S. UAV Multispectral Imagery Can Complement Satellite Data for Monitoring Forest Health. Remote Sens. 2018, 10, 1216. [Google Scholar] [CrossRef]
  135. Kerkech, M.; Hafiane, A.; Canals, R. Deep Learning Approach with Colorimetric Spaces and Vegetation Indices for Vine Diseases Detection In UAV Images. Comput. Electron. Agric. 2018, 155, 237–243. [Google Scholar] [CrossRef]
  136. Sarkar, S.K.; Das, J.; Ehsani, R.; Kumar, V. Towards Autonomous Phytopathology: Outcomes and Challenges of Citrus Greening Disease Detection Through Close-range Remote Sensing. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2016; pp. 5143–5148. [Google Scholar] [CrossRef]
  137. Su, J.; Liu, C.; Coombes, M.; Hu, X.; Wang, C.; Xu, X.; Li, Q.; Guo, L.; Chen, W.H. Wheat Yellow Rust Monitoring by Learning from Multispectral UAV Aerial Imagery. Comput. Electron. Agric. 2018, 155, 157–166. [Google Scholar] [CrossRef]
  138. Sugiura, R.; Tsuda, S.; Tsuji, H.; Murakami, N. Virus-Infected Plant Detection in Potato Seed Production Field By UAV Imagery. In Proceedings of the 2018 ASABE Annual International Meeting, Detroit, MI, USA, 29 July–1 August 2018; p. 1800594. [Google Scholar] [CrossRef]
  139. Tetila, E.C.; Machado, B.B.; Belete, N.A.S.; Guimarães, D.A.; Pistori, H. Identification of Soybean Foliar Diseases Using Unmanned Aerial Vehicle Images. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2190–2194. [Google Scholar] [CrossRef]
  140. Balota, M.; Oakes, J. Exploratory Use of A UAV Platform for Variety Selection In Peanut. Proc. SPIE 2016, 9866. [Google Scholar] [CrossRef]
  141. Gibson-Poole, S.; Humphris, S.; Toth, I.; Hamilton, A. Identification of the Onset of Disease within a Potato Crop Using a UAV Equipped with Un-modified and Modified Commercial Off-the-Shelf Digital Cameras. Adv. Anim. Biosci. 2017, 8, 812–816. [Google Scholar] [CrossRef]
  142. Sugiura, R.; Tsuda, S.; Tamiya, S.; Itoh, A.; Nishiwaki, K.; Murakami, N.; Shibuya, Y.; Hirafuji, M.; Nuske, S. Field Phenotyping System for the Assessment of Potato Late Blight Resistance Using RGB Imagery from an Unmanned Aerial Vehicle. Biosyst. Eng. 2016, 148, 1–10. [Google Scholar] [CrossRef]
  143. Zhang, D.; Zhou, X.; Zhang, J.; Huang, L.; Zhao, J. Developing A Small UAV Platform to Detect Sheath Blight of Rice. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 3190–3193. [Google Scholar] [CrossRef]
  144. Albetis, J.; Duthoit, S.; Guttler, F.; Jacquin, A.; Goulard, M.; Poilvé, H.; Féret, J.B.; Dedieu, G. Detection of Flavescence Dorée Grapevine Disease Using Unmanned Aerial Vehicle (UAV) Multispectral Imagery. Remote Sens. 2017, 9, 308. [Google Scholar] [CrossRef]
  145. Albetis, J.; Jacquin, A.; Goulard, M.; Poilvé, H.; Rousseau, J.; Clenet, H.; Dedieu, G.; Duthoit, S. On the Potentiality of UAV Multispectral Imagery to Detect Flavescence Dorée and Grapevine Trunk Diseases. Remote Sens. 2018, 11, 23. [Google Scholar] [CrossRef]
  146. Al-Saddik, H.; Simon, J.C.; Brousse, O.; Cointault, F. DAMAV Project for Vineyard Disease Detection by UAV Imagery. In Proceedings of the International Conference on Agricultural Engineering, Automation, Environment and Food Safety, Aarhus, Denmark, 26–29 June 2016; pp. 1–7. [Google Scholar]
  147. Calderón, R.; Montes-Borrego, M.; Landa, B.B.; Navas-Cortés, J.A.; Zarco-Tejada, P.J. Detection of Downy Mildew of Opium Poppy Using High-resolution Multi-spectral and Thermal Imagery Acquired with an Unmanned Aerial Vehicle. Precis. Agric. 2014, 15, 639–661. [Google Scholar] [CrossRef]
  148. Gennaro, S.D.; Battiston, E.; Marco, S.D.; Facini, O.; Matese, A.; Nocentini, M.; Palliotti, A.; Mugnai, L. Unmanned Aerial Vehicle (UAV)—Based Remote Sensing to Monitor Grapevine Leaf Stripe Disease within a Vineyard Affected By Esca Complex. Phytopathol. Mediterr. 2016, 55, 262–275. [Google Scholar]
  149. Khot, L.R.; Sankaran, S.; Carter, A.H.; Johnson, D.A.; Cummings, T.F. UAS Imaging-based Decision Tools for Arid Winter Wheat and Irrigated Potato Production Management. Int. J. Remote Sens. 2016, 37, 125–137. [Google Scholar] [CrossRef]
  150. Nebiker, S.; Lack, N.; Abächerli, M.; Läderach, S. Light-Weight Multispectral UAV Sensors and Their Capabilities for Predicting Grain Yield and Detecting Plant Diseases. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 963–970. [Google Scholar] [CrossRef]
  151. Patrick, A.; Pelham, S.; Culbreath, A.; Holbrook, C.C.; De Godoy, I.J.; Li, C. High Throughput Phenotyping of Tomato Spot Wilt Disease In Peanuts Using Unmanned Aerial Systems and Multispectral Imaging. IEEE Instrum. Meas. Mag. 2017, 20, 4–12. [Google Scholar] [CrossRef]
  152. Calderón, R.; Navas-Cortés, J.; Lucena, C.; Zarco-Tejada, P. High-resolution Hyperspectral and Thermal Imagery Acquired From UAV Platforms for Early Detection of Verticillium Wilt Using Fluorescence, Temperature and Narrow-band Indices. In Proceedings of the Workshop on UAV-based Remote Sensing Methods for Monitoring Vegetation, Cologne, Germany, 9–10 September 2013; pp. 7–14. [Google Scholar] [CrossRef]
  153. Smigaj, M.; Gaulton, R.; Barr, S.L.; Suárez, J.C. UAV-Borne Thermal Imaging for Forest Health Monitoring: Detection of Disease-Induced Canopy Temperature Increase. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 349–354. [Google Scholar] [CrossRef]
  154. Barbedo, J.G.A. A Review on The Main Challenges In Automatic Plant Disease Identification Based on Visible Range Images. Biosyst. Eng. 2016, 144, 52–60. [Google Scholar] [CrossRef]
  155. Sugiyama, M.; Nakajima, S.; Kashima, H.; Buenau, P.V.; Kawanabe, M. Direct Importance Estimation with Model Selection and Its Application to Covariate Shift Adaptation. In Advances In Neural Information Processing Systems 20; Platt, J.C., Koller, D., Singer, Y., Roweis, S.T., Eds.; Curran Associates Inc.: Vancouver, BC, Canada, 2008; pp. 1433–1440. [Google Scholar]
  156. Ben-David, S.; Blitzer, J.; Crammer, K.; Kulesza, A.; Pereira, F.; Vaughan, J.W. A Theory of Learning From Different Domains. Mach. Learn. 2010, 79, 151–175. [Google Scholar] [CrossRef]
  157. Lehmann, J.R.K.; Nieberding, F.; Prinz, T.; Knoth, C. Analysis of Unmanned Aerial System-Based CIR Images in Forestry—A New Perspective to Monitor Pest Infestation Levels. Forests 2015, 6, 594–612. [Google Scholar] [CrossRef]
  158. Stanton, C.; Starek, M.J.; Elliott, N.; Brewer, M.; Maeda, M.M.; Chu, T. Unmanned Aircraft System-derived Crop Height and Normalized Difference Vegetation Index Metrics for Sorghum Yield and Aphid Stress Assessment. J. Appl. Remote Sens. 2017, 11, 1–20. [Google Scholar] [CrossRef]
  159. Zhang, N.; Zhang, X.; Yang, G.; Zhu, C.; Huo, L.; Feng, H. Assessment of Defoliation During The Dendrolimus tabulaeformis Tsai et Liu Disaster Outbreak Using UAV-based Hyperspectral Images. Remote Sens. Environ. 2018, 217, 323–339. [Google Scholar] [CrossRef]
  160. Vanegas, F.; Bratanov, D.; Powell, K.; Weiss, J.; Gonzalez, F. A Novel Methodology for Improving Plant Pest Surveillance In Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data. Sensors 2018, 18, 260. [Google Scholar] [CrossRef]
  161. Vanegas, F.; Bratanov, D.; Weiss, J.; Powell, K.; Gonzalez, F. Multi and Hyperspectral UAV Remote Sensing: Grapevine Phylloxera Detection In Vineyards. In Proceedings of the 2018 IEEE Aerospace Conference, Big Sky, MT, USA, 4–11 March 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–9. [Google Scholar] [CrossRef]
  162. Yuan, Y.; Hu, X. Random Forest and Object-Based Classification for Forest Pest Extraction From UAV Aerial Imagery. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 1093–1098. [Google Scholar] [CrossRef]
  163. Li, L.; Fan, Y.; Huang, X.; Tian, L. Real-time UAV Weed Scout for Selective Weed Control By Adaptive Robust Control and Machine Learning Algorithm. In Proceedings of the 2016 American Society of Agricultural and Biological Engineers Annual International Meeting, ASABE 2016, Orlando, FL, USA, 17–20 July 2016; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2016. [Google Scholar] [CrossRef]
  164. Rasmussen, J.; Nielsen, J.; Garcia-Ruiz, F.; Christensen, S.; Streibig, J.C. Potential Uses of Small Unmanned Aircraft Systems (UAS) In Weed Research. Weed Res. 2013, 53, 242–248. [Google Scholar] [CrossRef]
  165. Yano, I.H.; Alves, J.R.; Santiago, W.E.; Mederos, B.J. Identification of Weeds In Sugarcane Fields Through Images Taken By UAV and Random Forest Classifier. IFAC-PapersOnLine 2016, 49, 415–420. [Google Scholar] [CrossRef]
  166. Huang, Y.; Reddy, K.N.; Fletcher, R.S.; Pennington, D. UAV Low-Altitude Remote Sensing for Precision Weed Management. Weed Technol. 2018, 32, 2–6. [Google Scholar] [CrossRef]
  167. Gennaro, S.F.D.; Matese, A.; Gioli, B.; Toscano, P.; Zaldei, A.; Palliotti, A.; Genesio, L. Multisensor Approach to Assess Vineyard Thermal Dynamics Combining High-resolution Unmanned Aerial Vehicle (UAV) Remote Sensing and Wireless Sensor Network (WSN) Proximal Sensing. Sci. Hortic. 2017, 221, 83–87. [Google Scholar] [CrossRef]
  168. Malenovsky, Z.; Lucieer, A.; King, D.H.; Turnbull, J.D.; Robinson, S.A. Unmanned Aircraft System Advances Health Mapping of Fragile Polar Vegetation. Methods Ecol. Evol. 2017, 8, 1842–1857. [Google Scholar] [CrossRef]
  169. Roy, R.; Miller, J. Miniaturization of Image Sensors: The Role of Innovations In Complementary Technologies in Overcoming Technological Trade-offs Associated with Product Innovation. J. Eng. Technol. Manag. 2017, 44, 58–69. [Google Scholar] [CrossRef]
Table 1. References dealing with the monitoring of water status in crops.

| Ref. | UAV Type | Crop | Sensor | Estimated Variables | Reference Variables | Model |
|------|----------|------|--------|---------------------|---------------------|-------|
| [62] | Rotary | Peanut | Multispectral | NDVI | Visual wilting score | Linear regression |
| [63] | Fixed wing | Vineyard | Thermal | CWSI | Leaf water potential | Linear regression |
| [64] | Rotary | Peach orchard | Multispectral, thermal | PRI | Stomatal conductance | Linear regression |
| [65] | Fixed wing | Olive orchard | Hyperspectral, thermal | Tc − Ta, CWSI, CF | Stomatal conductance | Linear regression |
| [58] | Rotary | Apple orchard | Multispectral, thermal, RGB | Tc − Ta, WDI | Soil water potential | Direct comparison |
| [66] | Rotary | Bog and mire vegetation | Thermal | CWSI | Soil moisture, fAPAR | Quadratic regression |
| [67] | Fixed wing | Citrus orchard | Hyperspectral | PRI | Water content | Linear regression |
| [68] | Rotary | Vineyard | Multispectral, thermal | NDVI, GNDVI, Tc | Stomatal conductance | Linear regression |
| [69] | Fixed wing | Almond, apricot, peach, orange | Thermal | Tc − Ta, CWSI | Stem water potential | Linear regression |
| [61] | Fixed wing | Mandarin and orange | Thermal | NWSB, CWSI | Stem water potential | Linear regression |
| [70] | Fixed wing | Barley | Thermal, RGB | WDI | Measured stress values | Direct comparison |
| [71] | Rotary | Pomegranate | Multispectral, thermal | CWSI | Irrigation data | Direct comparison |
| [72] | Rotary | Black poplar | Thermal | Canopy temperature | Stomatal conductance | Linear regression |
| [13] | Rotary | Vineyard | Multispectral, thermal, RGB | CWSI | Stomatal conductance | Direct comparison |
| [73] | Rotary | Nectarine orchard | Thermal | CWSI | Stem water potential, stomatal conductance | Linear regression |
| [74] | Rotary | Nectarine, peach | Thermal | Adaptive CWSI | Stem water potential, stomatal conductance | Linear regression |
| [75] | ? | Vineyard, olive orchard | Thermal | Tc − Ta | Stem water potential | Linear regression |
| [76] | ? | Vineyard | Multispectral | Narrow spectral bands | Stem water potential | MLP NN |
| [77] | Rotary | Vineyard | Multispectral, thermal | CWSI | Stem water potential | Linear regression |
| [78] | Rotary | Vineyard | Multispectral | Vegetation indices | Stem water potential | MLP NN |
| [79] | Rotary | Vineyard | Thermal | CWSI | Stem water potential, stomatal conductance | Linear regression |
| [80] | Fixed wing | Vineyard | Multispectral, RGB, NIR | TCARI/OSAVI | BRIX | Linear regression |
| [81] | Rotary | Orange orchard | Multispectral | PRI | Stem water potential | Linear regression |
| [82] | Rotary | Peach, nectarine, orange | Multispectral | PRI | Xanthophyll epoxidation state | Linear regression |
| [83] | Fixed wing | Cotton | Thermal | TIR emittance | Soil water content | Linear regression |
| [84] | Rotary | Olive, peach | Multispectral, thermal | Fluorescence (UAV) | Fluorescence (ground) | Linear regression |
| [85] | Fixed wing | Citrus orchard | Hyperspectral, thermal | PRI, VI, Tc | Stomatal conductance, leaf water potential | Linear regression |
| [86] | Fixed wing | Vineyard | Multispectral, thermal | PRI | Stomatal conductance, leaf water potential | Linear regression |
| [87] | Rotary | Almond orchard | Multispectral, RGB | NDVI | Stem water potential | Linear regression |
| [88] | Rotary | Almond orchard | Multispectral, RGB | NDVI | Stem water potential | Linear regression |
| [89] | ? | Almond orchard | Multispectral | Multispectral bands (PCA) | Stem water potential | Linear regression |

Legend: ?—The type of aircraft was not made clear in the original reference; BRIX—sucrose measure; CF—Chlorophyll fluorescence; CWSI—Crop water stress index; fAPAR—Fraction of absorbed photosynthetically active radiation; GNDVI—Green normalized difference vegetation index; MLP NN—Multilayer perceptron neural network; NDVI—Normalized difference vegetation index; NIR—Near infrared; NWSB—Non water stress baseline; PCA—Principal component analysis; PRI—Photochemical reflectance index; Ta—Air temperature; Tc—Canopy temperature; TCARI/OSAVI—Transformed chlorophyll absorption in reflectance index/Optimized soil-adjusted vegetation index; TIR—Thermal infrared; VI—Vegetation index; WDI—Water deficit index.
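Most entries in Table 1 rest on a small number of quantities defined in the legend above. As a rough illustration, the sketch below computes NDVI from its standard definition, (NIR − Red)/(NIR + Red), and an empirical CWSI that rescales Tc − Ta between a lower (non-water-stressed) and an upper (non-transpiring) baseline; the baseline values used here are placeholders, not values taken from any cited study.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + eps)

def cwsi(tc, ta, baseline_lower, baseline_upper):
    """Empirical crop water stress index: Tc - Ta rescaled between the
    non-water-stressed (lower) and non-transpiring (upper) baselines."""
    dt = np.asarray(tc, float) - np.asarray(ta, float)
    return np.clip((dt - baseline_lower) / (baseline_upper - baseline_lower), 0.0, 1.0)

# Hypothetical reading: canopy 2.5 C above air temperature, baselines -2 C and +5 C
print(cwsi(27.5, 25.0, baseline_lower=-2.0, baseline_upper=5.0))  # ~0.64
```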
Table 2. The references dealing with the monitoring of nutrient status in crops.

| Ref. | UAV Type | Crop | Sensor | Model Input | Model Output | Model Type |
|------|----------|------|--------|-------------|--------------|------------|
| [100] | Rotary | Sunflower | Multispectral | NDVI | Nitrogen treatment | Linear regression |
| [101] | Rotary | Cotton | Multispectral | Several VIs | Nitrogen concentration and uptake | Linear regression |
| [102] | Rotary | Wheat | Multispectral | NDVI | Nitrogen concentration and uptake | Linear regression |
| [103] | Rotary | Grass | Hyperspectral | Average reflectance spectra | Sodium and potassium content | PLS regression |
| [104] | Rotary | Turfgrass | Multispectral | NDVI | Nitrogen content | Linear regression |
| [105] | Rotary | Corn | CIR | Several vegetation indices | Nitrogen concentration and uptake | PLS regression |
| [106] | Rotary | Macadamia | RGB + NIR | CCCI, NDRE | Leaf nitrogen level | Linear regression |
| [9] | Rotary | Corn | Multispectral, hyperspectral | Variety of indices | Nitrogen concentration | Polynomial regression |
| [107] | Rotary | Wheat | Multispectral | NDVI, REIP | Nitrogen concentration | Linear regression |
| [108] | Fixed wing | Corn | RGB | NGRDI | Nitrogen status (chlorophyll content) | LS regression |
| [109] | Rotary | Potato | Multispectral | NDVI, GNDVI | N status (chlorophyll content, LAI) | Linear regression |
| [110] | Parafoil-wing | Potato | Multispectral | NDVI, GNDVI | N status (chlorophyll content, LAI) | Linear regression |
| [111] | Rotary | Sugar beet | Multispectral | VIs, green pixel fraction | Nitrogen concentration | Multilinear regression |
| [112] | Parafoil and fixed wing | Wheat | Multispectral | Vegetation indices | Nitrogen uptake | Exponential regression |
| [113] | Rotary | Rice | RGB | DGCI | Nitrogen concentration | Linear regression |
| [114] | Rotary | Wheat | Hyperspectral | Selected bands | Nitrogen concentration | Multilinear regression, MLPNN |
| [115] | ? | Winter oilseed rape | Multispectral | Vegetation indices | Nitrogen concentration | Linear regression |
| [26] | Rotary | Soybean | Multispectral, thermal, RGB | Spectral indices and features | Nitrogen concentration | PLSR, SVR, ELR |
| [116] | Rotary | Oat | Hyperspectral | Vegetation indices | Nitrogen concentration | Linear regression |
| [117] | Fixed wing | Rice | Multispectral | Vegetation indices | SPAD (chlorophyll content) | Linear regression |
| [118] | Rotary | Wheat | RGB | PCs of color features | Nitrogen concentration | Linear regression |
| [119] | Rotary | Canola | Multispectral, hyperspectral | Selected spectral bands | Potassium deficiency level | Discriminant analysis |
| [120] | Rotary | Rice | Multispectral | Vegetation indices | Nitrogen treatment | Linear regression |
| [121] | Rotary | Sunflower | Multispectral | NDVI | Nitrogen concentration | Linear regression |
| [122] | Rotary | Rice | Hyperspectral | PCs of spectral bands | Nitrogen concentration | Linear regression |
| [123] | ? | Wheat | RGB | Color parameters | Nitrogen treatment | Linear regression |
| [124] | Rotary | Corn | RGB | Pixels | Nitrogen deficiency level | Logistic regression |
| [125] | Rotary | Rice | RGB, CIR, multispectral | Vegetation indices | Nitrogen accumulation (leaf and plant) | Linear regression |
| [15] | Rotary | Wheat | Multispectral | RDVI | Nitrogen concentration | Several ML models |
| [126] | Rotary | Rice | RGB | PCs of color features | Nitrogen concentration | Quadratic regression |
| [127] | Rotary | Wheat | Hyperspectral | Selected bands | Nitrogen concentration | Multilinear regression |

Legend: ?—The type of aircraft was not made clear in the original reference; CCCI—Canopy chlorophyll content index; CIR—Color infrared; DGCI—Dark green colour index; ELR—Extreme learning regression; GNDVI—Green normalized difference vegetation index; LAI—Leaf area index; LS—Least squares; MLPNN—Multilayer perceptron neural network; NDRE—Normalized difference red edge; NDVI—Normalized difference vegetation index; NGRDI—Normalized green-red difference index; NIR—Near infrared; PC—Principal component; PLSR—Partial least squares regression; RDVI—Renormalized difference vegetation index; REIP—Red-edge inflection point; SPAD—Soil-plant analyses development; SVR—Support vector regression; VI—Vegetation index.
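As Table 2 shows, the dominant modeling pattern for nutrient status is a simple regression of a nitrogen-related variable on one or more vegetation indices. The sketch below reproduces that pattern with entirely made-up NDVI/leaf-nitrogen pairs, fitting a least-squares line and reporting R², the goodness-of-fit summary most of these studies use.

```python
import numpy as np

# Hypothetical paired samples: plot-level NDVI vs. laboratory-measured leaf N (%)
ndvi_vals = np.array([0.52, 0.61, 0.66, 0.71, 0.78, 0.83])
leaf_n    = np.array([1.9, 2.3, 2.4, 2.8, 3.1, 3.4])

# Degree-1 polynomial fit = ordinary least-squares line of leaf N on NDVI
slope, intercept = np.polyfit(ndvi_vals, leaf_n, 1)
predicted = slope * ndvi_vals + intercept

# Coefficient of determination (R^2)
ss_res = np.sum((leaf_n - predicted) ** 2)
ss_tot = np.sum((leaf_n - leaf_n.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"leaf N = {slope:.2f} * NDVI + {intercept:.2f}, R^2 = {r2:.3f}")
```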
Table 3. The references dealing with the monitoring of diseases in crops.

| Ref. | UAV Type | Crop | Disease | Sensor | Model Input | Model Output | Model Type |
|------|----------|------|---------|--------|-------------|--------------|------------|
| [144] | Fixed wing | Vineyard | Flavescence dorée | Multispectral | 20 indices and parameters | Classif. healthy and diseased | ROC analysis |
| [145] | Fixed wing | Vineyard | Flavescence dorée, grapevine trunk | Multispectral | 24 indices and parameters | Classif. healthy and 2 diseases | ROC analysis |
| [146] | Rotary | Vineyard | Flavescence dorée | Multispectral | Image pixels | Classif. healthy and diseased | RBFNN |
| [131] | Rotary | Sugar beet | Leaf spot | RGB | L*a*b* color pixels | Disease severity | K-means clustering |
| [140] | Rotary | Peanut | Late leaf spot | RGB | Hue angle, greener area | Leaf drop (disease indicator) | Linear regression |
| [65] | Fixed wing | Olive orchard | Verticillium wilt | Hyperspectral, thermal | Tc − Ta, CWSI, CF | Disease severity | ANOVA analysis |
| [152] | Fixed wing | Olive orchard | Verticillium wilt | Hyperspectral, thermal | Tc − Ta, CWSI, CF | Disease severity | ANOVA analysis |
| [132] | Rotary | Radish | Fusarium wilt | RGB | Color and texture features | Disease severity | CNN |
| [133] | Rotary | Pinus forest | Simulated (herbicide) | Multispectral | Vegetation indices | Disease severity | Random forest |
| [134] | Rotary | Pinus forest | Simulated (herbicide) | Multispectral | Vegetation indices | Disease severity | Random forest |
| [148] | Rotary | Vineyard | Grapevine leaf stripe | Multispectral | NDVI | Disease severity | Linear regression |
| [135] | ? | Vineyard | N/A | RGB | Vegetation indices | Classif. ground, healthy, diseased | CNN |
| [149] | Rotary | Potato | Necrosis | Multispectral | GNDVI | Disease severity | Linear regression |
| [150] | Fixed wing | Potato | Potato blight | Multispectral, NIR | NDVI | Disease severity | Visual inspection |
| [151] | Rotary | Peanut | Tomato spot wilt | Multispectral | Vegetation indices | Disease severity | Linear regression |
| [141] | Rotary | Potato | Blackleg disease | RGB, NIR | NDVI | Disease detection | Thresholding |
| [14] | Rotary | Paperbark tea trees | Myrtle rust | Hyperspectral | Vegetation indices | 5-class classification | XGBoost |
| [136] | Rotary | Citrus | HLB | RGB | Pixels | Classif. healthy and diseased | SVM |
| [153] | Fixed wing | Scots pine | Red band needle blight | Thermal | Raw crown temperature | Disease severity | Linear regression |
| [137] | Rotary | Wheat | Wheat yellow rust | Multispectral | Vegetation indices | Disease severity | Random forest |
| [142] | Rotary | Potato | Potato late blight | RGB | Severity index | Disease severity | Thresholding |
| [138] | Rotary | Potato | Potato virus Y | RGB | Cropped images | Classif. healthy and diseased | CNN |
| [139] | Rotary | Soybean | Target spot, powdery mildew | RGB | Color, texture, shape features | Classif. healthy and 2 diseases | Several classifiers |
| [143] | Rotary | Rice | Sheath blight | RGB, multispectral | NDVI | Disease severity | Linear regression |

Legend: ?—The type of aircraft was not made clear in the original reference; CF—Chlorophyll fluorescence; CNN—Convolutional neural network; CWSI—Crop water stress index; GNDVI—Green normalized difference vegetation index; HLB—Huanglongbing; N/A—Not available; NDVI—Normalized difference vegetation index; NIR—Near infrared; RBFNN—Radial basis function neural network; ROC—Receiver operating characteristic; SVM—Support vector machine; Ta—Air temperature; Tc—Canopy temperature.
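Several of the disease studies in Table 3 (e.g., the thresholding entries) reduce to flagging pixels whose vegetation index drops below a reference level and summarizing the flagged fraction as a severity proxy. A minimal sketch of that idea follows; the threshold and the NDVI patch are hypothetical, and in practice the cutoff must be calibrated per crop, sensor, and acquisition date.

```python
import numpy as np

def flag_diseased(ndvi_map, threshold=0.6):
    """Binary mask: pixels whose NDVI falls below the (crop- and
    date-specific) threshold are flagged as potentially diseased."""
    return np.asarray(ndvi_map) < threshold

# Hypothetical 2x3 NDVI patch from an orthomosaic
patch = np.array([[0.81, 0.77, 0.42],
                  [0.74, 0.38, 0.79]])
mask = flag_diseased(patch)
severity = mask.mean()  # fraction of flagged pixels as a crude severity proxy
print(mask, f"severity ~ {severity:.2f}")  # severity ~ 0.33
```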
Table 4. The references dealing with the monitoring of pests in crops.

| Ref. | UAV Type | Crop | Pest | Sensor | Model Input | Model Output | Model Type |
|------|----------|------|------|--------|-------------|--------------|------------|
| [109] | Rotary | Potato | Colorado potato beetle | Multispectral | NDVI, GNDVI | Damage detection | Linear regression |
| [157] | Rotary | Oak | Oak splendour beetle | Multispectral (CIR) | NDVI | Damage quantification | PCA |
| [150] | Fixed wing | Onion | Thrips | Multispectral, NIR | NDVI | Damage detection | Visual inspection |
| [16] | Rotary | Pine forest | Pine processionary moth | RGB | Moisture stress index | Damage quantification | Logistic regression |
| [119] | Rotary | Canola | Green peach aphid | Multispectral, hyperspectral | NDVI | Potassium content (indirect) | Discriminant analysis |
| [158] | Fixed wing | Sorghum | Sugarcane aphid | Multispectral | NDVI | Aphid density | Linear regression |
| [160] | Rotary | Vineyard | Grape phylloxera | Multispectral, hyperspectral, RGB | Vegetation indices, DVM | Plant vigor | Linear regression |
| [161] | Rotary | Vineyard | Grape phylloxera | Multispectral, hyperspectral, RGB | Vegetation indices, DVM | Plant vigor | Linear regression |
| [162] | ? | Forest | N/A | RGB | Texture features | Damage detection | Random forest |
| [159] | Rotary | Chinese pine | Chinese pine caterpillar | Hyperspectral | Selected bands | Defoliation quantification | Piecewise PLSR |

Legend: ?—The type of aircraft was not made clear in the original reference; CIR—Color infrared; DVM—Digital vigor model; GNDVI—Green normalized difference vegetation index; N/A—Not available; NDVI—Normalized difference vegetation index; NIR—Near infrared; PCA—Principal component analysis; PLSR—Partial least squares regression.
Table 5. The references dealing with the monitoring of weeds in crops.

| Ref. | UAV Type | Crop | Weed | Sensor | Model Input | Model Output | Model Type |
|------|----------|------|------|--------|-------------|--------------|------------|
| [166] | Rotary, fixed wing | Soybean | Palmer amaranth, Italian ryegrass | Hyperspectral | Images | Weed detection | Visual inspection |
| [163] | Rotary | N/A | Morningglory, cocklebur, Palmer amaranth, waterhemp | RGB | Cropped images | Classif. four weed species | CNN |
| [164] | Rotary | Barley | N/A | RGB | Excess green index | Weed harrowing impact | Linear regression |
| [17] | Rotary, fixed wing | Sorghum | Palmer amaranth, barnyardgrass, Texas panicum, morningglory | RGB | Excess green index | Weed infestation | K-means |
| [165] | Rotary | Sugarcane | Tridax daisy, sourgrass | RGB | Statistical image descriptors | Crop and weed classification | Random forest |
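Two of the weed studies in Table 5 use the excess green index (ExG) as the model input. A small sketch of its computation on chromatic (sum-normalized) RGB coordinates, ExG = 2g − r − b, is given below with invented pixel values; green-dominant vegetation scores well above bare soil, which is what makes the index useful for separating plants from background before weed/crop classification.

```python
import numpy as np

def excess_green(rgb):
    """Excess green index, ExG = 2g - r - b, computed on chromatic
    coordinates so illumination differences are partly normalized out."""
    rgb = np.asarray(rgb, float)
    total = rgb.sum(axis=-1, keepdims=True) + 1e-9
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2 * g - r - b

# Hypothetical pixels: bare soil (brownish) vs. vegetation (green-dominant)
pixels = np.array([[120, 100, 80], [60, 140, 50]])
print(excess_green(pixels))  # [0.0, 0.68]: vegetation scores clearly higher
```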
Table 6. The references dealing with the monitoring of other types of stress in crops.

| Ref. | UAV Type | Crop | Sensor | Model Input | Model Output | Model Type |
|------|----------|------|--------|-------------|--------------|------------|
| [167] | Rotary | Vineyard | Multispectral, thermal | Thermal data, NDVI | Heat stress quantification | Linear regression |
| [168] | Rotary | Moss | Hyperspectral | Spectral bands | Chlorophyll content, leaf density | Support vector regression |
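The second entry in Table 6 maps chlorophyll content with support vector regression on selected spectral bands. A minimal scikit-learn sketch of that setup is shown below; the reflectance matrix, the band-to-chlorophyll relationship, and the hyperparameters are all synthetic placeholders rather than values from the cited study.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical training data: 40 samples x 5 selected spectral bands, with
# chlorophyll content loosely tied to a red-edge-like band plus noise
X = rng.uniform(0.05, 0.6, size=(40, 5))
y = 30 * X[:, 3] + rng.normal(0, 0.5, size=40)

# RBF-kernel support vector regression, as used for the chlorophyll maps
model = SVR(kernel="rbf", C=10.0).fit(X, y)
print(model.predict(X[:3]))  # chlorophyll estimates for the first three samples
```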
