Article

Imaging for High-Throughput Phenotyping in Energy Sorghum

Department of Biological & Agricultural Engineering, Texas A&M University, 2117 TAMU, College Station, TX 77843-2117, USA
*
Author to whom correspondence should be addressed.
Submission received: 3 November 2015 / Revised: 15 January 2016 / Accepted: 15 January 2016 / Published: 26 January 2016
(This article belongs to the Special Issue Image Processing in Agriculture and Forestry)

Abstract

The increasing energy demand in recent years has resulted in a continuously growing interest in renewable energy sources, such as efficient and high-yielding energy crops. Energy sorghum is a crop that has shown great potential in this area, but needs further improvement. Plant phenotyping—measuring physiological characteristics of plants—is a laborious and time-consuming task, but it is essential for crop breeders as they attempt to improve a crop. The development of high-throughput phenotyping (HTP)—the use of autonomous sensing systems to rapidly measure plant characteristics—offers great potential for vastly expanding the number of varieties of a given crop that can be surveyed. HTP can thus enable much more rapid progress in crop improvement through the inclusion of more genetic variability. For energy sorghum, stalk thickness is a critically important phenotype, as the stalk contains most of the biomass. Imaging is an excellent candidate for certain phenotypic measurements, as it can simulate visual observations. The aim of this study was to evaluate image analysis techniques involving K-means clustering and minimum-distance classification for use on red-green-blue (RGB) images of sorghum plants as a means to measure stalk thickness. Additionally, a depth camera integrated with the RGB camera was tested for the accuracy of distance measurements between camera and plant. Eight plants were imaged on six dates through the growing season, and image segmentation, classification and stalk thickness measurement were performed. While accuracy levels with both image analysis techniques needed improvement, both showed promise as tools for HTP in sorghum. The average error for K-means with supervised stalk measurement was 10.7% after removal of known outliers.


1. Introduction

In this age of increasing concerns about population growth, energy demands and environmental risks, the need to expand energy resources in a sustainable manner is intensifying. Non-renewable energy from fossil fuels will be insufficient at a certain point in the future [1,2]. Additionally, carbon dioxide emissions from fossil fuels have led to concerns about rising levels of greenhouse gases [3], increasing the importance of developing alternative energy sources, like dedicated energy crops that reduce net carbon dioxide emissions. One solution to these issues is to increase energy crop production by maximizing crop output per unit area.
Improving crop productivity is possible through improvements in breeding and genetics. Historically, breeders and geneticists relied on manual labor to evaluate crop physical traits (phenotypes) on a relatively small number of plant varieties. The advent of sensing and robotics has provided the potential to drastically accelerate the advances of breeding and genetics through “high-throughput phenotyping” (HTP). This new research field involves using autonomous technologies to rapidly and accurately measure phenotypes on a large number of plants, enabling the selection of promising varieties and genes from a broader genetic pool to make faster strides in crop improvement. With dedicated energy crops, the most important phenotypes are biomass yield, rate of growth and resistance to typical crop stresses.
Certain crops like sorghum (Sorghum bicolor) have been demonstrated to have excellent potential for dedicated bioenergy production. Sorghum is a C4 plant with a high biomass accumulation rate, and it can be economically produced and harvested in four months or less with suitable environmental conditions [3,4,5]. While some varieties of sorghum are grown mainly for grain (grain sorghum), others are grown mainly for the sugars they produce (sweet sorghum) or for their high levels of biomass (forage sorghum or “energy sorghum”) [6].
Plant breeders can potentially accelerate biomass production improvements in energy sorghum by using an HTP approach. Previous studies have pursued automation of phenotype-measurement techniques involving lasers, light curtains, ultrasonic sensors and artificial vision [7,8,9]. The use of phenotypic data along with genotypes of energy sorghum has been reported in a few recent studies [10,11,12,13]. Artificial vision systems are able to extract meaningful information from images, such as qualitative and quantitative morphologic variables [14], like plant height, width and shape, stem diameter and leaf number, area and angle. There is also the potential to use spectral characteristics in images to measure plant health, such as by correlating greenness to chlorophyll content [15,16,17,18]. Machine vision is potentially faster, more reliable and more robust than manual phenotyping methods [19]. A needed development in machine vision is the capability of working under difficult imaging conditions, like varying light levels, spectra and angles in crop fields, as well as varying temperature, etc. [20].
In a dedicated energy crop like energy sorghum, stalk thickness and plant height are the most significant phenotypes correlated to biomass, because a large majority of the plant dry matter resides in the stalk. Plant height can be measured with ultrasonic sensors, but an automated measuring system for stalk thickness remains a significant challenge because of the possible occlusion of the stalk and the difficulty of differentiating the stalk from connected stems. Numerous imaging techniques for HTP have been developed, and many are available at the Plant Image Analysis website [21]. Two of the available software resources at that website are Root Estimator for Shovelomics Traits (REST; [22]) and Digital Imaging of Root Traits (DIRT; [23]), both of which are dedicated to the analysis of images to measure root morphology, such as width, depth, length, shape and diameter. Neither of these incorporates measurement of distance between camera and plant material. It is clear that currently available software offers length measurement capabilities for plants, particularly root masses, but the authors are not aware of existing software dedicated to plant stalk measurement, particularly when camera distance is included. The first objective of this research was to evaluate common image analysis techniques as potential tools for measuring stalk thickness that could be used in automated HTP. The second objective was to evaluate an active depth camera as a tool for measuring distance between camera and plant, a measurement that would likely be necessary if an image-based stalk thickness measuring system were to be deployed.

2. Experimental Section

2.1. Materials

2.1.1. Energy Sorghum Material

Two genetic varieties of energy sorghum known as R07019 (Variety A) and R07007 (Variety B) were planted in plastic pots. These varieties have been studied as potential dedicated energy crops by Dr. John Mullet of Texas A&M University and were made available for this phenotyping study. Four plants per variety were planted, two plants to a pot. Each pot was supplied with 17 g of Osmocote 14-14-14 slow release fertilizer (Scotts). The plants were grown under well-watered conditions; i.e., the soil surface was kept moist throughout the experiment. All pots were progressively thinned to 1 plant per pot as the seedlings grew. The pots were placed in a greenhouse where environmental conditions were monitored with a TinyTag Data Logger© (Gemini Data Loggers). The average daily minimum temperature and relative humidity were 24.1 °C (±1.1 °C) and 36.0% (±14.0%), respectively. The average daily maximum temperature and relative humidity were 32.3 °C (±1.8 °C) and 69.9% (±11.3%). Plants emerged roughly five days after planting and were without branching nodes for the first few weeks. Thus, the first images were collected about three weeks after planting, when the plants began to exhibit thicker stalks and leaves emerging from the stalk.

2.1.2. Camera

An inexpensive (under $200 USD in 2015) combined camera (Creative Technology Ltd., Milpitas, CA, USA, Model VF0780), composed of a standard RGB camera and a 3D active camera (SoftKinetic Inc., Sunnyvale, CA, USA), was used to capture images of the sorghum plants. The 3D camera employs the time-of-flight (TOF) concept, measuring the time between the emission of a light pulse and the reception of its reflection, to determine depth values across the field of view. TOF cameras, or “depth cameras”, are commonly composed of a near-infrared (NIR) source and a CMOS pixel-array detector. The depth camera used here includes a three-chip TOF system designed to be sensitive only in the 850 to 870-nm range, matching the NIR source. This TOF system provides a 320 × 240 resolution image in which pixel values relate to the distance from image objects to the emitter on the camera. Manufacturer specifications list the precision of the depth values as within the range of 5 to 300 cm, with the best performance from 15 cm to 100 cm. The fields of view (FOV) of the RGB and depth cameras are not identical. The RGB camera resolution can be selected from 320 × 240 to 1280 × 720. Initially, 320 × 240 was used for the RGB images for consistency with the depth images, but in later stages of plant growth, the RGB resolution was increased to 1280 × 720 for more precise measurements of stalk thickness.
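For reference, the TOF principle relates the measured round-trip delay of the emitted pulse to distance by d = c·Δt/2, where c is the speed of light and Δt is the time between pulse emission and detection of the reflected light; the factor of two accounts for the out-and-back travel of the pulse.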

2.1.3. Imaging Environment

A polyvinyl chloride (PVC) pipe frame was built to hold a black cotton cloth to block sunlight in the greenhouse and to aid in isolating sorghum plants from their surroundings, facilitating plant classification during image analysis. The cloth also helped block possible incoming NIR energy that might interfere with the depth camera.

2.2. Methods

Over a period of ten weeks, images and stalk thickness measurements of the sorghum plants were collected on roughly a weekly basis. According to recommendations by Dr. William Rooney of Texas A&M University [24], stalk thickness was to be measured immediately above the plant’s third internode to provide consistent and representative measurements of the entire stalk. In the early stages of growth, the sorghum plants did not exhibit internodes, so a measurement height of 7 cm above the soil was used until internodes were defined, which occurred about six weeks after data collection began. Sorghum stalks have a roughly elliptical cross-section, so effective thickness was determined based on an elliptical shape model. Manual thickness measurements were made with a caliper. Both the wider (major-axis) and narrower (minor-axis) dimensions of the stalk were measured for each plant on each data collection date. The wider dimension of the sorghum stalk, which is perpendicular to the direction in which the leaves emerge, was oriented perpendicular to the camera viewing direction. A marker was placed on the pots during the first measurement so that the orientation of the plants would be the same each time measurements were made. Distance from the camera lens to the background cloth was measured with a ruler to the nearest half cm.
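The exact formula used to compute effective thickness from the two caliper readings is not given; one common choice for an elliptical cross-section, offered here purely as an illustrative assumption, is the diameter of the circle with the same area as the ellipse, d_eff = sqrt(d_major × d_minor), where d_major and d_minor are the wider and narrower stalk dimensions.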
Plants were placed between roughly 25 and 70 cm from the camera, as manually measured with a ruler to the nearest half cm, horizontally from the edge of the camera lens to the center of the sorghum stalk. Distances were varied to provide a range of distances to compare between manual and depth-camera measurements and also, in some cases, to provide adequate distance to image the entire plant. The camera was placed so that the point where caliper measurements were made was just below the image center. In the later stages when plants had developed internodes, the camera was placed so that the third internode was slightly below the center of the image. The camera was horizontally leveled to reduce possible error from image distortion that might occur if the camera were not perpendicular to the plant’s frontal plane.

2.2.1. Image-Distance Calibration

Image-distance calibration consisted of producing a conversion factor from pixels to length units (mm). A 47.6-mm diameter PVC tube was placed in front of the camera at distances from 20 to 90 cm in 10-cm increments, and an RGB image was collected at each distance. The number of pixels belonging to the object at each distance was determined, and an equation was developed to convert pixel counts to length as a function of camera-to-object distance. The accuracy of the depth-camera distance measurements was evaluated separately; that analysis is discussed below.
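A minimal sketch of such a calibration, assuming Python with NumPy and hypothetical pixel counts for the 47.6-mm tube (the procedure is reported here, but not the measured pixel widths), might look like the following:

```python
# Hedged sketch of the pixel-to-length calibration: a 47.6 mm PVC tube is imaged
# at 20-90 cm, its apparent width in pixels is recorded, and a first-order fit
# gives mm-per-pixel as a function of camera-to-object distance.
import numpy as np

TUBE_DIAMETER_MM = 47.6
distances_cm = np.arange(20, 100, 10)                          # 20, 30, ..., 90 cm
tube_width_px = np.array([230, 153, 115, 92, 77, 66, 57, 51])  # hypothetical pixel widths

mm_per_px = TUBE_DIAMETER_MM / tube_width_px                   # mm represented by one pixel

# For a pinhole camera, mm-per-pixel grows roughly linearly with distance,
# so a straight-line fit is a reasonable calibration model.
slope, intercept = np.polyfit(distances_cm, mm_per_px, 1)

def pixels_to_mm(n_pixels, distance_cm):
    """Convert a pixel count to millimetres at a given camera-to-object distance."""
    return n_pixels * (slope * distance_cm + intercept)

print(round(pixels_to_mm(60, 55), 1))   # e.g. a 60-pixel-wide stalk imaged at 55 cm
```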

2.2.2. Image Analysis

To measure the stalk thickness of a sorghum plant in an image, pixels associated with the stalk must be segmented from pixels associated with other parts of the plant, as well as from the background. K-means clustering and minimum-distance classification were used in this work.

K-Means Algorithm

The K-means clustering algorithm used in this article was adapted from Peter Corke [25] and performs color-based segmentation with a pre-selected number of clusters, C. The algorithm maps each RGB value to the xy-chromaticity color space and then identifies clusters to segregate the pixels into image classes associated with those clusters. The algorithm initially selects locations in the data space as cluster centers and then calculates the Euclidean distance [26] between each pixel and every cluster center over multiple iterations. At the end of an iteration, each pixel is assigned to the cluster with the closest center, and every cluster center location is recalculated as the average location of the pixels assigned to that cluster. The final result is a classification of the image into C classes [25,27,28,29].
Each image was segmented into ten classes. The user identified the five classes that best represented the plants, based on whether or not the class clearly contained almost exclusively plant matter, and the cluster center locations were stored to a text file. After segmentation, noise in the image was reduced with a morphological opening using a 9 × 9 square kernel. The five selected classes were then combined to produce a binary image that was an initial separation of plant from background. The combined image was then processed by area opening to remove small artifacts. Finally, to reduce the effects of plant lesions and minor stalk deformities, a morphological closing with a 5 × 5 square kernel was applied to the image. The result was a binary image of the plant segmented from the background (Figure 1a).
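A compact sketch of this segmentation and cleanup pipeline is given below. It assumes Python with OpenCV, scikit-learn and scikit-image as stand-ins for the MATLAB toolbox used here, rg-chromaticity in place of xy-chromaticity, and hypothetical values for the file name, the user-chosen plant clusters and the area-opening size:

```python
# Minimal sketch of colour-based K-means segmentation followed by morphological
# cleanup (9 x 9 opening, area opening, 5 x 5 closing), as described above.
import numpy as np
import cv2
from sklearn.cluster import KMeans
from skimage.morphology import remove_small_objects

C = 10  # number of clusters, as in the text

img = cv2.cvtColor(cv2.imread("sorghum_rgb.png"), cv2.COLOR_BGR2RGB).astype(np.float64)
chrom = (img / (img.sum(axis=2, keepdims=True) + 1e-9))[:, :, :2]   # r,g chromaticity

labels = KMeans(n_clusters=C, n_init=10, random_state=0).fit_predict(chrom.reshape(-1, 2))
label_image = labels.reshape(img.shape[:2])

# Clusters judged by the user to contain almost exclusively plant matter (hypothetical IDs)
plant_clusters = [1, 3, 4, 7, 9]
mask = np.isin(label_image, plant_clusters).astype(np.uint8)

mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((9, 9), np.uint8))   # noise removal
mask = remove_small_objects(mask.astype(bool), min_size=200)               # area opening
mask = cv2.morphologyEx(mask.astype(np.uint8), cv2.MORPH_CLOSE,
                        np.ones((5, 5), np.uint8))                          # close lesions
binary_plant = mask.astype(bool)    # plant vs. background, cf. Figure 1a
```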
Figure 1. RGB original, classified and binary images of sorghum plants during the third week of growth of June, Plant 3B. (a) Segmentation using K-means clustering; (b) segmentation using minimum-distance classification.

Minimum Distance Algorithm

Minimum-distance classification of color images considers the three RGB components as orthogonal coordinates in the Euclidean space, and individual pixels are assigned to classes according to their RGB distance from pixel classes that have been defined by training. Four classes were defined by the user according to homogeneous zones in the images, as shown in Figure 2a. In this figure, the four classes are clearly separable [26]: the black cloth background, enclosed by the blue rectangle; the green sorghum plant leaf, enclosed by the green rectangle; the bright sorghum stalk, enclosed in orange; and the brown soil, enclosed in red. The spectral separability is shown in the histograms of Figure 3. The statistical properties shown are distribution, mean and standard deviation, which were used as the references for the minimum-distance classification algorithm. The mean is the center point to which pixel RGB-space distances are compared, while standard deviation is used as a threshold to determine whether or not a pixel is assigned to a defined class. Each image pixel was compared to all class means, and the shortest distance determined whether a pixel would be assigned to one class or another. A threshold of two standard deviations from all four classes was used to label pixels as “unclassified”. Once the images were processed, the plant regions were stored as binary images (Figure 1b) and used subsequently to calculate stalk thickness.
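The following sketch illustrates the minimum-distance classification step, assuming Python with NumPy/OpenCV, hypothetical training rectangles, and a Euclidean two-standard-deviation radius as one possible form of the threshold test (the exact form of the test is not spelled out above):

```python
# Sketch of the minimum-distance classifier: class means and standard deviations
# are estimated from user-drawn training rectangles, each pixel is assigned to
# the nearest class mean in RGB space, and pixels farther than two standard
# deviations from every class are left unclassified.
import numpy as np
import cv2

img = cv2.cvtColor(cv2.imread("sorghum_rgb.png"), cv2.COLOR_BGR2RGB).astype(np.float64)

# Hypothetical training rectangles: (row0, row1, col0, col1) per class
train_boxes = {
    "background": (10, 60, 10, 60),
    "leaf":       (100, 130, 150, 200),
    "stalk":      (160, 220, 110, 130),
    "soil":       (210, 235, 20, 80),
}

means, thresholds = {}, {}
for name, (r0, r1, c0, c1) in train_boxes.items():
    sample = img[r0:r1, c0:c1].reshape(-1, 3)
    means[name] = sample.mean(axis=0)
    thresholds[name] = 2.0 * np.linalg.norm(sample.std(axis=0))   # 2-sigma radius

class_names = list(train_boxes)
pixels = img.reshape(-1, 3)
dists = np.stack([np.linalg.norm(pixels - means[n], axis=1) for n in class_names], axis=1)

labels = dists.argmin(axis=1)                                     # nearest class mean
too_far = np.all(dists > np.array([thresholds[n] for n in class_names]), axis=1)
labels[too_far] = len(class_names)                                # "unclassified"
label_image = labels.reshape(img.shape[:2])
```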
Figure 2. Images of sorghum Plant 1A on June 30. The minimum distance classification required training samples to segment the images: (a) original image and the location where the samples were obtained, enclosed by rectangles (blue = background; orange = bright stalk; green = plant leaf; red = soil); (b) image segmented into the five classes, including unclassified pixels.

Stalk Thickness Calculation Algorithm

Stalk-thickness calculation was conducted in “supervised” and “unsupervised” modes. In the supervised mode, the user selected a location in the image to define the stalk center. A 3 × 3 pixel window was then used to automatically scan the image in search of the edges of the stalk. Once the edge pixels were identified by the algorithm, stalk thickness in pixels was calculated as the difference between the right and left edge locations. In unsupervised mode, the user identified the stalk center and allowed the algorithm to select upper and lower stalk-thickness measurement locations; the average of the upper and lower measurements was taken as the stalk thickness (Figure 4). In both modes, stalk thickness in pixels was multiplied by the pixel-length conversion factor, which was dependent on the distance between plant and camera, to obtain stalk thickness in mm.
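As an illustration of the supervised measurement, the sketch below (an assumption-level simplification that scans a single image row rather than the 3 × 3 window described above) finds the left and right stalk edges in the binary plant mask from a user-selected center point and converts the width to millimetres:

```python
# Hedged sketch of supervised stalk-thickness measurement on a binary plant mask.
import numpy as np

def stalk_thickness_mm(binary_plant, center_row, center_col, mm_per_pixel):
    """Scan left and right from a user-selected stalk point to find the stalk edges."""
    row = binary_plant[center_row]

    left = center_col
    while left > 0 and row[left - 1]:              # walk left while still on the stalk
        left -= 1
    right = center_col
    while right < row.size - 1 and row[right + 1]: # walk right while still on the stalk
        right += 1

    thickness_px = right - left + 1                # stalk width in pixels
    return thickness_px * mm_per_pixel             # convert using the calibration factor

# e.g. thickness = stalk_thickness_mm(binary_plant, 180, 160, 0.45)
```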
Figure 3. Frequency histograms of RGB values for each training sample. The solid line in the middle of each distribution indicates the location of the mean value, while the dashed lines indicate the locations two standard deviations from the mean for each training sample. The information from the histograms was used to define each class center, while the standard deviation was used to define each class threshold. (a) Background; (b) plant; (c) soil; (d) stalk.

Depth Camera Accuracy

For every plant image collected, a manual measurement of the distance between the camera and plant stalk was conducted with a ruler, to the nearest half centimeter. In the depth-camera images, the pixel values represented the distance between the object and the camera. The depth-camera images included some erroneous pixel values, so the average of a group of pixels belonging to the plant stalk was used to minimize the effect of noise in the images. Two representative areas of the stalk in the depth image were selected by the user, and a 3 × 3 window was selected from each area. The average value from the 18 selected pixels was used as the depth-camera distance between camera and plant. Depth-camera distances were evaluated by comparing them to manual stalk-to-camera distance measurements by way of simple linear regression.
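A small sketch of this depth sampling, assuming Python/NumPy and hypothetical window centres chosen by the user on the stalk, is shown below:

```python
# Average two user-selected 3 x 3 windows on the stalk in the depth image
# (18 pixels in total) to suppress noisy depth readings; the returned value is
# taken as the camera-to-stalk distance reported by the depth camera.
import numpy as np

def stalk_depth(depth_image, centers=((120, 160), (150, 162))):   # hypothetical centres
    samples = [depth_image[r - 1:r + 2, c - 1:c + 2].ravel() for r, c in centers]
    return float(np.concatenate(samples).mean())
```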
Figure 4. Results of image analysis. The golden box and red lines show the location where the stalk thickness was measured during this procedure: (a) K-means clustering, supervised stalk selection; (b) minimum-distance classification, supervised stalk selection.

3. Results and Discussion

The stalk thickness values determined through image analysis were compared to the manually-measured values and the percent error calculated for each plant and each date (Table 1). Averaged across all plants and dates, the average percent errors were 27.9 for K-means with unsupervised stalk measurement (KMU), 19.5 for minimum distance with unsupervised stalk measurement (MDU), 16.0 for K-means with supervised stalk measurement (KMS) and 16.7 for minimum distance with supervised stalk measurement (MDS). All KMS stalk-thickness measurements are plotted against the caliper values in Figure 5a. Linear regression of these data points resulted in an R2 value of 0.28, indicating that KMS measurements only account for 28% of the variability in the caliper measurements.
Table 1. Comparison of stalk thickness measurements using K-means unsupervised (KMU), K-means supervised (KMS), minimum-distance unsupervised (MDU) and minimum-distance supervised (MDS) modes. The plant ID shows the plant number and the variety. The measured value is the value obtained with the stalk-thickness calculation algorithm. Seven data points were determined to be outliers and removed from the final analysis (see Figure 5).
Date | Plant ID | Caliper (mm) | KMU (mm) | MDU (mm) | KMS (mm) | MDS (mm) | KMU error (%) | MDU error (%) | KMS error (%) | MDS error (%)
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- | ---
30 June | 1A | 25.58 | 24.12 | 32.27 | 28.47 | 27.47 | 5.71 | 26.17 | 11.29 | 7.41
30 June | 2A | 33.17 | 28.96 | 40.13 | 30.15 | 29.18 | 12.69 | 20.98 | 9.12 | 12.04
30 June | 3A | 30.82 | 31.18 | 31.41 | 28.25 | 26.88 | 1.17 | 1.91 | 8.32 | 12.77
30 June | 4A | 32.59 | 35.75 | 35.11 | 32.85 | 27.31 | 9.70 | 7.74 | 0.80 | 16.20
30 June | 1B | 29.75 | 21.84 | 30.62 | 29.65 | 31.45 | 26.59 | 2.92 | 0.35 | 5.70
30 June | 2B | 28.61 | 27.57 | 37.41 | 34.60 | 32.96 | 3.64 | 30.78 | 20.95 | 15.19
30 June | 3B | 27.76 | 21.65 | 26.22 | 27.49 | 25.36 | 22.01 | 5.56 | 0.96 | 8.65
30 June | 4B | 25.62 | 60.49 | 39.14 | 32.77 | 26.98 | 136.10 | 52.79 | 27.90 | 5.29
7 July | 1A | 40.81 | 19.07 | 58.85 | 45.05 | 49.09 | 53.27 | 44.20 | 10.38 | 20.28
7 July | 2A | 48.14 | 34.41 | 41.70 | 36.72 | 38.83 | 28.52 | 13.38 | 23.73 | 19.34
7 July | 3A | 34.91 | 20.73 | 35.05 | 26.69 | 26.85 | 40.62 | 0.40 | 23.55 | 23.08
7 July | 4A | 44.34 | 29.50 | 36.88 | 33.76 | 34.52 | 33.47 | 16.82 | 23.86 | 22.15
7 July | 1B | 38.35 | 15.44 | 31.84 | 26.24 | 29.48 | 59.74 | 16.97 | 31.58 | 23.13
7 July | 2B | 32.73 | 21.52 | 32.23 | 31.33 | 29.43 | 34.25 | 1.51 | 4.29 | 10.07
7 July | 3B | 29.09 | 25.07 | 30.90 | 27.86 | 30.43 | 13.82 | 6.22 | 4.23 | 4.61
7 July | 4B | 47.65 | 20.16 | 28.48 | 28.80 | 29.31 | 57.69 | 40.23 | 39.56 | 38.49
14 July | 1A | 36.54 | 35.44 | 34.25 | 38.05 | 32.10 | 3.01 | 6.27 | 4.13 | 12.16
14 July | 2A | 36.04 | 40.80 | 40.50 | 34.42 | 32.92 | 13.21 | 12.38 | 4.51 | 8.64
14 July | 3A | 35.38 | 41.10 | 41.98 | 37.50 | 36.87 | 16.17 | 18.65 | 5.99 | 4.22
14 July | 4A | 33.68 | 41.59 | 35.09 | 37.14 | 36.47 | 23.49 | 4.20 | 10.27 | 8.29
14 July | 1B | 36.68 | 35.40 | 36.81 | 32.76 | 31.55 | 3.49 | 0.37 | 10.69 | 14.00
14 July | 2B | 34.51 | 31.20 | 36.08 | 34.98 | 35.55 | 9.59 | 4.54 | 1.36 | 3.00
14 July | 3B | 22.22 | 35.39 | 38.57 | 29.83 | 30.71 | 59.27 | 73.59 | 34.23 | 38.23
14 July | 4B | 37.28 | 44.04 | 34.63 | 34.32 | 31.37 | 18.13 | 7.11 | 7.94 | 15.85
21 July | 1A | 20.45 | 28.60 | 31.55 | 32.77 | 31.88 | 39.85 | 54.29 | 60.24 | 55.91
21 July | 2A | 25.51 | 22.77 | 25.98 | 27.00 | 26.77 | 10.74 | 1.84 | 5.82 | 4.94
21 July | 3A | 25.17 | 24.36 | 26.34 | 24.41 | 24.77 | 3.22 | 4.65 | 3.04 | 1.59
21 July | 4A | 34.26 | 31.41 | 32.74 | 32.10 | 32.20 | 8.32 | 4.43 | 6.32 | 6.01
21 July | 1B | 21.01 | 16.10 | 57.74 | 18.67 | 17.11 | 23.37 | 174.81 | 11.16 | 18.58
21 July | 2B | 24.75 | 37.99 | 25.20 | 26.23 | 27.25 | 53.49 | 1.83 | 5.98 | 10.11
21 July | 3B | 25.70 | 43.96 | 39.39 | 44.28 | 44.75 | 71.05 | 53.28 | 72.30 | 74.13
21 July | 4B | 22.89 | 37.24 | 36.41 | 38.16 | 38.99 | 62.69 | 59.07 | 66.71 | 70.32
4 August | 1A | 21.01 | 24.12 | 23.31 | 24.66 | 23.14 | 14.80 | 10.97 | 17.35 | 10.12
4 August | 2A | 25.22 | 28.96 | 29.13 | 32.09 | 32.09 | 14.83 | 15.50 | 27.24 | 27.24
4 August | 3A | 25.58 | 31.18 | 26.33 | 27.66 | 26.99 | 21.89 | 2.94 | 8.13 | 5.51
4 August | 4A | 27.03 | 35.75 | 24.83 | 29.06 | 29.70 | 32.26 | 8.15 | 7.49 | 9.88
4 August | 1B | 20.45 | 21.84 | 21.55 | 23.79 | 23.54 | 6.80 | 5.39 | 16.31 | 15.11
4 August | 2B | 23.02 | 27.57 | 27.43 | 30.18 | 30.48 | 19.77 | 19.17 | 31.08 | 32.41
4 August | 3B | 22.26 | 21.65 | 21.28 | 24.46 | 25.90 | 2.74 | 4.42 | 9.88 | 16.33
4 August | 4B | 26.22 | 60.49 | 22.49 | 24.16 | 24.31 | 130.70 | 14.21 | 7.88 | 7.29
11 August | 1A | 24.23 | 19.07 | 23.31 | 23.55 | 23.64 | 21.30 | 3.78 | 2.83 | 2.42
11 August | 2A | 25.23 | 34.41 | 29.13 | 29.12 | 27.96 | 36.39 | 15.46 | 15.42 | 10.81
11 August | 3A | 24.25 | 20.73 | 26.33 | 22.87 | 21.86 | 14.52 | 8.59 | 5.69 | 9.86
11 August | 4A | 26.13 | 29.50 | 24.83 | 31.99 | 30.26 | 12.90 | 4.99 | 22.43 | 15.80
11 August | 1B | 19.14 | 15.44 | 21.55 | 17.32 | 17.40 | 19.33 | 12.60 | 9.51 | 9.07
11 August | 2B | 20.31 | 21.52 | 27.43 | 20.13 | 20.32 | 5.96 | 35.07 | 0.91 | 0.02
11 August | 3B | 21.19 | 25.07 | 21.28 | 27.78 | 28.26 | 18.31 | 0.41 | 31.08 | 33.38
11 August | 4B | 21.68 | 20.16 | 22.49 | 20.80 | 20.96 | 7.01 | 3.76 | 4.06 | 3.34
Close inspection of Figure 5a indicated that several of the data points might be outliers, so the following procedure (adapted from [30,31]) was used to identify actual outliers that should be removed from consideration:
(1) Residuals were calculated and plotted against the caliper measurements, revealing heteroscedasticity (residual values were lower at low stalk-thickness values and higher at high stalk-thickness values).
(2) Because the residuals were heteroscedastic, both the KMS and caliper measurements were log-transformed. Log(caliper) was regressed against log(KMS), and the residuals were plotted again, indicating that the heteroscedasticity had been mitigated.
(3) Studentized residuals of the log(caliper) vs. log(KMS) relationship were calculated, and their distribution was examined. Seven data points at the positive and negative tails of the distribution were significantly disconnected from the rest of the data and were flagged as suspected outliers.
(4) The original images associated with the suspected outliers were inspected to determine whether anomalies capable of producing major errors were present. For each suspected outlier, the images indicated that significant error-causing anomalies were indeed present, so these data points were removed, and simple linear regression was conducted on the remaining dataset (Figure 5b).
The analysis after removing outliers provided an R2 value of 0.70, indicating that KMS measurements accounted for 70% of the variability in the caliper measurements. Furthermore, with the seven outliers removed, the average error for KMS measurements was reduced to 10.7%.
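A minimal sketch of this screening procedure, assuming Python with the statsmodels library and an externally studentized residual cut-off of |t| > 2.5 (the seven suspect points above were identified by inspecting the tails of the residual distribution rather than by a fixed cut-off), might look like the following:

```python
# Hedged sketch of the outlier screening: log-transform both measurements to
# stabilise the variance, fit an OLS regression, and flag points with large
# externally studentized residuals for visual inspection of their images.
import numpy as np
import statsmodels.api as sm

def flag_suspected_outliers(caliper_mm, kms_mm, cutoff=2.5):
    x = sm.add_constant(np.log(np.asarray(kms_mm)))      # log(KMS) with intercept
    y = np.log(np.asarray(caliper_mm))                    # log(caliper)
    fit = sm.OLS(y, x).fit()
    studentized = fit.get_influence().resid_studentized_external
    return np.where(np.abs(studentized) > cutoff)[0]      # indices of suspect points
```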
Figure 5. Linear regressions for caliper measurements vs. K-means supervised measurements: (a) the regression before removing outliers (R2 = 0.28, RMSE = 6.27 mm); (b) after removal of outliers (R2 = 0.70, RMSE = 3.19 mm). Plot units are in mm.
As a representation of the image classification errors, Figure 6 shows the original and two classified images of Plant 4B on 30 June, which had the highest error of all plants measured on that date (except with MDS). This plant had leaves emerging from the sides of the stalk over most of the stalk area, which made it difficult for the stalk measurement algorithm to identify the actual edges of the stalk and contributed error to the measurement. On 7 July, the plant with the highest percent error had non-plant pixels included in the binary image of the stalk, indicating poor segmentation that led the analysis to measure only part of the stalk. These types of errors were typical of the image-based measurements with high error.
Figure 6. The images presented are examples of anomalies in the experiment. K-means unsupervised (KMU), minimum-distance unsupervised (MDU) and K-means supervised (KMS) were three of the four methods compared in this experiment. (a) 30 June, 4B; (b) 30 June, 4B, KMU (136.10); (c) 30 June, 4B, MDU (52.79). The percent error between the manual measurement and the algorithm measurement is shown in parentheses (see also Table 1).
In terms of the plant growth stage at which measurements were most accurate, later-stage plants had lower error than early-stage plants. By the seven-week point, the plants had developed reasonably well-defined stalks that were easier to segment and measure. The relatively low percent error at later growth stages showed the techniques to be most accurate at the time phenotyping would be most useful. In terms of which methods were most accurate, K-means clustering was more accurate with supervised stalk measurement, but minimum-distance classification was more accurate with unsupervised stalk measurement. Supervised stalk measurement had lower error than unsupervised when using either K-means or minimum distance. This result suggests that the position where stalk thickness is measured is critical to the accuracy of the measurement, and future research should focus on how the image-processing algorithm determines the best measurement location. As previously mentioned, the most obvious errors in measuring stalk thickness occurred when leaves protruding from the stalk were included in the stalk-thickness measurement. Additional image-processing techniques—e.g., to consider whether the left and right stalk edges are parallel—could better isolate the stalk, enabling the largest stalk thickness measurement errors to be mitigated. In other cases, the source of major error was either unknown or related to poor image segmentation, probably due to lack of contrast at particular points on the plant. To mitigate these errors, more research would be required, as solutions are not obvious.
The results of comparing depth-camera distance measurements to manual camera-to-stalk distance measurements are presented in Figure 7. The plot of data points indicates a high level of linearity, and the R2 value of 0.96 indicates that the depth camera accounted for 96% of the variability in actual camera-to-stalk distance. The average error for camera-to-stalk distance on a given day ranged from about 5% to 20%, while the root mean square error (RMSE) was 2.5 cm. It is clear from the plot that at camera-to-stalk distances greater than 65 cm, the percent error is low. The stalk thickness error associated with a depth-camera distance RMSE of 2.5 cm is approximately 3%, meaning that the roughly 10% average stalk thickness error resulting from image processing is large compared to the depth-camera error. The depth-camera feature of this combined camera thus appears to be acceptably accurate to provide distance data that would enable calculation of pixel-to-length conversion factors. If this capability could be combined with the image-processing techniques to segment the plant and calculate the number of pixels associated with the thickness of a stalk, then stalk thickness measurement for energy sorghum plants could be fully automated. Pixel-to-pixel correspondence between the depth and RGB images remains an issue to be solved.
Figure 7. Linear regression of distance measurements from the camera to the plant, measured manually and measured with the depth camera images. The R2 value calculated for this relationship is 0.96, and the RMSE is 2.5 cm.

4. Conclusions

This study showed that an inexpensive combined (RGB plus depth) camera and common image analysis techniques (K-means clustering and minimum-distance classification) have potential for measuring stalk thickness in a high-throughput phenotyping system for energy sorghum plants. Furthermore, the integrated near-infrared time-of-flight depth camera showed excellent potential for providing the distance measurements that would be required if stalk thickness were to be measured in an automated way. These techniques could also be applied to other monocots, such as maize and rice. If successfully implemented, such a combined system could greatly reduce the labor and time required to make these measurements in phenotyping experiments, ultimately enabling many more plants to be measured than is currently possible and potentially accelerating crop improvement.

Acknowledgments

The authors acknowledge the contributions of Brock Weers, Assistant Research Scientist, and John Mullet, Professor, both in the Department of Biochemistry and Biophysics at Texas A&M University. Weers planted and maintained the energy sorghum plants, and Mullet provided materials for the plant growth, space in the greenhouse and advice on sorghum phenotyping.

Author Contributions

Jose Batz developed procedures for data collection, unsupervised K-means clustering and stalk diameter measurement. He also collected and analyzed data and wrote significant portions of the manuscript. Mario A. Méndez compiled a review of the literature, developed procedures for minimum distance classification and wrote portions of and compiled the manuscript. J. Alex Thomasson oversaw the research and reviewed and approved the methods used, as well as the results obtained. Additionally, he edited and added final interpretive comments to the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Davies, J. Better distribution holds the key to future food security for all. Poult. World 2015, 170, 1–19.
2. Koçar, G.; Civas, N. An overview of biofuels from energy crops: Current status and future prospects. Renew. Sustain. Energy Rev. 2013, 28, 900–916.
3. Chapman, S.C.; Chakraborty, S.; Dreccer, M.F.; Howden, S.M. Plant adaptation to climate change—Opportunities and priorities in breeding. Crop Pasture Sci. 2012, 63, 251–268.
4. Calviño, M.; Messing, J. Sweet sorghum as a model system for bioenergy crops. Curr. Opin. Biotechnol. 2012, 23, 323–329.
5. Cobb, J.N.; DeClerck, G.; Greenberg, A.; Clark, R.; McCouch, S. Next-generation phenotyping: Requirements and strategies for enhancing our understanding of genotype–phenotype relationships and its relevance to crop improvement. Theor. Appl. Genet. 2013, 126, 867–887.
6. Regassa, T.H.; Wortmann, C.S. Sweet sorghum as a bioenergy crop: Literature review. Biomass Bioenergy 2014, 64, 348–355.
7. Busemeyer, L.; Mentrup, D.; Möller, K.; Wunder, E.; Alheit, K.; Hahn, V.; Maurer, H.P.; Reif, J.C.; Würschum, T.; Müller, J.; et al. BreedVision—A multi-sensor platform for non-destructive field-based phenotyping in plant breeding. Sensors 2013, 13, 2830–2847.
8. Kicherer, A.; Herzog, K.; Pflanz, M.; Wieland, M.; Rüger, P.; Kecke, S.; Kuhlmann, H.; Töpfer, R. An automated field phenotyping pipeline for application in grapevine research. Sensors 2015, 15, 4823–4836.
9. Klodt, M.; Herzog, K.; Töpfer, R.; Cremers, D. Field phenotyping of grapevine growth using dense stereo reconstruction. BMC Bioinf. 2015, 16, 1–11.
10. Upadhyaya, H.; Wang, Y.H.; Sharma, S.; Singh, S.; Hasenstein, K. SSR markers linked to kernel weight and tiller number in sorghum identified by association mapping. Euphytica 2012, 187, 401–410.
11. Mullet, J.; Morishige, D.; McCormick, R.; Truong, S.; Hilley, J.; McKinley, B.; Anderson, R.; Olson, S.; Rooney, W. Energy sorghum—A genetic model for the design of C4 grass bioenergy crops. J. Exp. Bot. 2014, 65, 1–11.
12. van der Weijde, T.; Alvim Kamei, C.L.; Torres, A.F.; Wilfred, V.; Oene, D.; Visser, R.G.F.; Trindade, L.M. The potential of C4 grasses for cellulosic biofuel production. Front. Plant Sci. 2013, 4, 1–18.
13. Trouche, G.; Bastianelli, D.; Cao Hamadou, T.V.; Chantereau, J.; Rami, J.F.; Pot, D. Exploring the variability of a photoperiod-insensitive sorghum genetic panel for stem composition and related traits in temperate environments. Field Crops Res. 2014, 166, 72–81.
14. Chéné, Y.; Rousseau, D.; Lucidarme, P.; Bertheloot, J.; Caffier, V.; Morel, P.; Belin, É.; Chapeau-Blondeau, F. On the use of depth camera for 3D phenotyping of entire plants. Comput. Electron. Agric. 2012, 82, 122–127.
15. Vollmann, J.; Walter, H.; Sato, T.; Schweiger, P. Digital image analysis and chlorophyll metering for phenotyping the effects of nodulation in soybean. Comput. Electron. Agric. 2011, 75, 190–195.
16. Chaivivatrakul, S.; Tang, L.; Dailey, M.N.; Nakarmi, A.D. Automatic morphological trait characterization for corn plants via 3D holographic reconstruction. Comput. Electron. Agric. 2014, 109, 109–123.
17. Roscher, R.; Herzog, K.; Kunkel, A.; Kicherer, A.; Töpfer, R.; Förstner, W. Automated image analysis framework for high-throughput determination of grapevine berry sizes using conditional random fields. Comput. Electron. Agric. 2014, 100, 148–158.
18. Aksoy, E.E.; Abramov, A.; Wörgötter, F.; Scharr, H.; Fischbach, A.; Dellen, B. Modeling leaf growth of rosette plants using infrared stereo image sequences. Comput. Electron. Agric. 2015, 110, 78–90.
19. Fahlgren, N.; Gehan, M.A.; Baxter, I. Lights, camera, action: High-throughput plant phenotyping is ready for a close-up. Curr. Opin. Plant Biol. 2015, 24, 93–99.
20. Kolukisaoglu, Ü.; Thurow, K. Future and frontiers of automated screening in plant sciences. Plant Sci. 2010, 178, 476–484.
21. Lobet, G.; Draye, X.; Périlleux, C. An online database for plant image analysis software tools. Plant Methods 2013, 9.
22. Colombi, T.; Kirchgessner, N.; Le Marié, C.; York, L.; Lynch, J.; Hund, A. Next generation shovelomics: Set up a tent and REST. Plant Soil 2015, 388, 1–20.
23. Bucksch, A.; Burridge, J.; York, L.M.; Das, A.; Nord, E.; Weitz, J.S.; Lynch, J. Image-based high-throughput field phenotyping of crop roots. Plant Physiol. 2014, 166, 470–486.
24. Rooney, W.L. (Soil and Crop Science Professor, Texas A&M University, College Station, TX, USA). Stalk diameter measurement location in energy sorghum. Personal communication, 2015.
25. Corke, P. Robotics, Vision and Control: Fundamental Algorithms in MATLAB; Springer Science & Business Media: Berlin, Germany, 2011; Volume 73.
26. Ghimire, S.; Wang, H. Classification of image pixels based on minimum distance and hypothesis testing. Comput. Stat. Data Anal. 2012, 56, 2273–2287.
27. Liang, B.; Zhang, J. KmsGC: An unsupervised color image segmentation algorithm based on K-means clustering and graph cut. Math. Probl. Eng. 2014, 2014, 464875.
28. Sert, E.; Okumus, I.T. Segmentation of mushroom and cap width measurement using modified K-means clustering algorithm. Adv. Electr. Electron. Eng. 2014, 12, 354–360.
29. Yu, W.; Wang, J.; Ye, L. An improved normalized cut image segmentation algorithm with K-means cluster. Appl. Mech. Mater. 2014, 548–549, 1179–1184.
30. Neter, J.; Kutner, M.H.; Nachtsheim, C.J.; Wasserman, W. Applied Linear Statistical Models, 4th ed.; McGraw-Hill: Boston, MA, USA, 1996.
31. Cline, D. Statistical Analysis; STAT 601 Class Notes; Texas A&M University: College Station, TX, USA, 2015.
