Article

YOLO-JD: A Deep Learning Network for Jute Diseases and Pests Detection from Images

Dawei Li, Foysal Ahmed, Nailong Wu and Arlin I. Sethi

1 College of Information Sciences and Technology, Donghua University, Shanghai 201620, China
2 State Key Laboratory for Modification of Chemical Fibers and Polymer Materials, Donghua University, Shanghai 201620, China
3 Engineering Research Center of Digitized Textile and Fashion Technology, Ministry of Education, Donghua University, Shanghai 201620, China
4 Department of Chemistry, Faculty of Science, National University of Bangladesh, Gazipur, Dhaka 1704, Bangladesh
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Submission received: 26 February 2022 / Revised: 25 March 2022 / Accepted: 27 March 2022 / Published: 30 March 2022
(This article belongs to the Special Issue Epidemiology and Control of Plant Diseases)

Abstract

Recently, disease prevention in jute plants has become an urgent topic as a result of the growing demand for finer quality fiber. This research presents a deep learning network called YOLO-JD for detecting jute diseases from images. Into the main architecture of YOLO-JD, we integrated three new modules: the Sand Clock Feature Extraction Module (SCFEM), the Deep Sand Clock Feature Extraction Module (DSCFEM), and the Spatial Pyramid Pooling Module (SPPM), which extract image features effectively. We also built a new large-scale image dataset for jute diseases and pests with ten classes. Compared with other state-of-the-art methods, YOLO-JD achieved the best detection accuracy, with an average mAP of 96.63%.

1. Introduction

Jute (Corchorus olitorius L. or C. capsularis L.) is one of the most important fiber crops and an inexpensive source of high-quality fiber. It is also referred to as the “golden fiber” crop. Jute fiber is strong, soft, lustrous, and relatively long. In the Indo-Bangladesh subcontinent, commercial jute farming is mostly limited to longitudes 80°18′ E–92° E and latitudes 21°24′ N–26°30′ N [1]. Jute is a herbaceous annual that may grow to a height of 10 to 12 feet (3 to 3.6 m) and has a cylindrical stalk about the thickness of a finger. It is said to have originated in the Indian subcontinent. The main differences between the two jute species cultivated for fiber lie in the form of their seed pods, growth habits, and fiber qualities. Most types of jute prefer well-drained sandy loam in warm and humid areas with at least 3 to 4 inches (7.5 to 10 cm) of monthly rainfall during the growing season. The light green leaves of the plant are usually 4 to 6 inches (10 to 15 cm) long, 2 inches (5 cm) broad, serrated on the margins, and taper to a point.
Jute is a biodegradable natural polymer that decomposes quickly in the environment. On the other hand, although synthetic polymers such as polystyrene, polyethylene, polypropylene, and polyvinyl chloride offer better mechanical qualities, sustainability, and durability than natural polymers for producing plastics, they are not biodegradable and can seriously pollute the environment [2]. Plastic pollution is one of the biggest environmental issues nowadays. In 2019, plastic manufacturing and incineration produced more than 850 million metric tons of greenhouse gases, the equivalent of the emissions from 189 coal power plants with a 500-megawatt capacity [3]. The most serious issue with plastic is that it does not decompose in the environment and has accumulated for decades in streams, agricultural soils, rivers, and the ocean [3]. To protect the environment, it is necessary to substitute synthetic polymers with biodegradable and ecologically friendly ones. Therefore, natural jute polymer has become increasingly popular in both domestic and international markets. The process of jute fiber production comprises several steps (shown in Figure 1). Given the importance of jute production, disease prevention in jute has become an urgent task of precision agriculture.
Plant diseases and pests are a global threat to crop yields, and they may be even more destructive for smallholder farmers whose livelihoods depend heavily on healthy harvests. Unfortunately, jute is still usually cultivated by smallholder farmers. Disease symptoms appear on leaves, fruits, buds, and young branches on jute plants. Jute diseases come in a variety of types, each of which can result in big economic loss. Recently, disease prevention has become increasingly significant as a result of the demand for finer quality fiber [4]. Precision plant protection offers a non-destructive means of managing plant diseases based on the concept of spatio-temporal variability [5,6], and those works have inspired us to transplant new technology from the computer vision and artificial intelligence fields to detect and manage plant diseases.
In this scenario, early and precise detection of plant diseases and pests is critical for avoiding losses in agricultural production. Traditionally, the detection of plant diseases and pests is performed manually by experts such as botanists and agricultural engineers. The disease investigation usually begins with a visual assessment followed by a laboratory test. Although visual inspection can be helpful, especially for new farmers, traditional approaches are typically time consuming, require complex procedures, and demand specialized knowledge. Therefore, during the past several years, researchers have used image processing and machine learning techniques to detect or classify plant diseases. For example, Maniyath et al. [7] proposed a classification architecture using a machine learning approach to detect plant diseases and pests. Gavhale et al. [8] proposed a framework using K-means clustering to recognize the defects and areas of disease on plant leaves. Hossain et al. [9] used a Support Vector Machine (SVM) to recognize diseases on tea leaves.
Deep learning (DL) has previously been proven successful for real-life object identification, recognition, and classification [10], and the agricultural industry has increasingly turned to DL-based models. State-of-the-art outcomes have been achieved using deep learning approaches on tasks such as plant identification, fruit harvesting, and crop/weed classification. Recent research has also concentrated on the detection of plant diseases [11]. Convolutional neural networks such as YOLOv3 [12], YOLOv4 [13], Faster R-CNN [14], Mask R-CNN [15], and SSD [16] have been successfully applied to crop disease detection. For example, Hammad et al. [17] realized image-based plant disease identification with deep learning meta-architectures. Chowdhury et al. [18] proposed a deep learning model based on EfficientNet and used 18,161 tomato leaf images to classify tomato diseases. Mohanty et al. [19], using AlexNet and GoogleNet models, were able to identify 14 crop species and 26 diseases from a dataset comprising 54,306 images of both diseased and healthy plant leaves. Görlich et al. [20] proposed a UAV-based classification of Cercospora leaf spot disease on RGB images. Chen et al. [21] designed a model that automatically detects rubber tree diseases in images using an improved YOLOv5. Arsenovic et al. [22] proposed a new deep learning approach to detect 13 different types of plant diseases. Vishnoi et al. [23] developed two different DL approaches to detect diseases in the PlantVillage dataset. Wagle et al. [24] proposed a CNN model with transfer learning from AlexNet to classify nine species of plants from the PlantVillage dataset.
There are still several existing limitations in disease identification for jute plants; e.g., (i) the lack of a dataset for jute diseases; (ii) the majority of the current approaches on plant disease detection are based on traditional machine learning methods, generating unsatisfactory performances; (iii) the research on jute disease detection via image processing is rare; and (iv) it is difficult to implement multi-class disease detection because different diseases have very diversified appearances.
To overcome the above-mentioned limitations, we formulate the objectives of this research as follows. The first objective is to establish a brand-new image dataset for jute diseases and pests with accurate manual labels, which should not only include several thousand images captured in different environments and weather conditions, but should also incorporate multiple disease/pest classes. To the best of our knowledge, it will be the first published jute disease and pest image dataset in the field. The second objective is to explore new deep learning network architectures and modules that are fit for crop disease detection, especially for jute diseases. The new network is also expected to outcompete some popular networks designed for object detection. The third objective is to validate the feasibility of applying deep learning models of the YOLO family to the detection (or recognition) of jute diseases and pests, and to provide guidance for scientists working on both agricultural engineering and artificial intelligence.
The content of this paper is structured as follows. The jute diseases and pests dataset and the architecture of our detection model YOLO-JD are specified in Section 2. Experimental results with the ablation study are provided in Section 3. Discussion of the results takes place in Section 4. Conclusions are drawn in the last section.

2. Material and Methods

2.1. Dataset

The images of jute diseases and pests were collected in the Jamalpur and Narail districts of Bangladesh in July 2021. To diversify the dataset, the images were captured under both sunny and cloudy weather over the course of a single day. The images were taken with a Canon PowerShot G16 camera and the camera of a Samsung Galaxy S10 at different viewing angles and different distances (0.3–0.5 m). In total, 4418 images covering multiple jute disease and pest classes were obtained. The light intensity and background circumstances of the images vary greatly in the dataset. Though the image sizes are not uniform, we apply a normalization step at the beginning of the network to unify all images to a fixed resolution of 640 × 640. Eight common diseases, including stem rot, anthracnose, black band, soft rot, tip blight, dieback, jute mosaic, and jute chlorosis, as well as two pests, the Jute Hairy Caterpillar and Cosmophila sabulifera, are incorporated into our dataset. Some sample images are displayed in Figure 2, and the symptomatic patterns and causes [25] of all jute diseases and pests are listed in Table 1.
Stem rot usually causes long, blackened rotted areas on the main stem of jute. Economically, it is the most serious disease of jute: it reduces fiber yield both quantitatively and qualitatively, and can even pass the infection on to seeds of the next generation. The symptoms of anthracnose are sunken spots of various colors on different parts of the plant, usually observed on stems. The irregular spots of anthracnose often cause deep necrotic spots on stems, which may further result in cracks in the fiber and even the withering of the infected plant. The disease can also infect jute seeds; infected seeds are lighter in color, with shrunken shapes and poor germination. Black band was a minor disease in the past, but it has become more prevalent due to climate change. Black band causes dark-colored areas on the infected stem, together with defoliation. Initially, it may be confused with stem rot because the infected areas of both are spot-like. Soft rot is still a common fungal disease of jute. It may appear on all growing areas of a jute plant, but its intensity is usually low. Soft rot typically attacks when the jute crop reaches 80–90 days old. The fungus grows from the soil and slowly infects fallen jute leaves; from there, it moves up to the stem base and then travels to other parts of the plant. In the past, tip blight was a minor disease, but it has since developed into different new varieties. The disease causes the blighting of newly emerged sprouts at the tip of young plants; the infected sprouts turn from green to black and then slowly rot in high humidity. Dieback is relatively rare. It usually starts at the top of the plant, where leaves begin to droop and wither and later dry up. Infected branches slowly turn brown and then black, remaining attached as dead, dry parts. Jute mosaic is a common disease nowadays. It creates small yellow dots (like flakes) on the leaf lamina in the early stage; the dots then gradually enlarge into yellow mosaics on the leaves. Jute chlorosis causes yellow chlorotic spots with sharp margins on the leaves. Cosmophila sabulifera and the Hairy Caterpillar are two common pests on jute leaves. The former is much larger than the latter, and Hairy Caterpillar individuals usually tend to aggregate into a flock.
The entire preparation procedure of our jute disease and pest dataset is as follows. First, we applied image pre-processing methods such as brightness correction and image filtering to the sample images to enhance the quality of the dataset. Then, 556 images were selected to form the testing set, and the remaining 3862 images formed the training set. Finally, the annotation software ‘LabelImg’ [26] was used to draw the ground truth bounding boxes of the diseases or pests in all images.
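For illustration, the following is a minimal sketch of the image-size normalization step mentioned in Section 2.1, using OpenCV. Plain resizing (rather than aspect-preserving letterboxing, which the paper does not specify) is an assumption, and the function name is hypothetical.

```python
import cv2

def normalize_image(path: str, size: int = 640):
    """Load a jute image of arbitrary resolution and resize it to the
    fixed 640 x 640 network input described above."""
    img = cv2.imread(path)  # BGR image at its native resolution
    if img is None:
        raise FileNotFoundError(path)
    return cv2.resize(img, (size, size), interpolation=cv2.INTER_LINEAR)
```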

2.2. Overall Architecture

Building on YOLOv4, YOLOv5 improved both detection performance and computational efficiency, making it perhaps the most popular solution for object detection tasks nowadays. Despite its popularity, the standard YOLOv5 still has problems with generalization and domain adaptation (e.g., a performance decline can be observed when applying YOLOv5 to our jute dataset). Inspired by YOLOv5, this research proposes a new model, YOLO-JD, for the detection and recognition of jute diseases and pests by redesigning the architecture. Figure 3 shows the overall architecture of the proposed YOLO-JD, which can be divided into three main components: (i) the head (backbone) component, a backbone network that uses the Sand Clock Feature Extraction Module (SCFEM), the Spatial Pyramid Pooling Module (SPPM), and the Deep Sand Clock Feature Extraction Module (DSCFEM) to extract features at different levels; (ii) the neck component, which collects cross-stage features extracted from three different layers of the head component and then generates three different high-level feature maps; and (iii) the detection component, which incorporates anchor results under different scales to create aggregated detection boxes. The full architecture of YOLO-JD also contains several kinds of compact operations and calculation steps such as CBL (Conv2D + Batch Normalization + Leaky ReLU activation), NMS (Non-max Suppression), Up-sampling (Us), and Concatenation.
In the head component, the input feature dimension changes from 640 × 640 × 3 to 320 × 320 × 32 after the focus module with the shuffling scheme shown in Figure 3. The features then pass through several operations and modules such as SCFEM, CBL, DSCFEM, and SPPM, generating a multi-level output for the neck component. The multi-level output includes three feature maps: two are the outputs of DSCFEMs, and the other is the output of an SCFEM. In each CBL operation, we sequentially carry out Conv2D, batch normalization, and the Leaky ReLU activation. The DSCFEMs are used in the first component of the YOLO-JD network to collect important low-level image features, while the SCFEMs are applied mainly in the middle component to extract mid-level features.
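As a concrete illustration, a minimal PyTorch sketch of the CBL operation (Conv2D + batch normalization + Leaky ReLU) could look as follows; the default kernel size and the LeakyReLU negative slope of 0.1 are assumptions, as the paper does not report them.

```python
import torch.nn as nn

class CBL(nn.Module):
    """Conv2D + Batch Normalization + Leaky ReLU activation."""
    def __init__(self, in_ch: int, out_ch: int, k: int = 3, s: int = 1):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, k, s, padding=k // 2, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.LeakyReLU(0.1, inplace=True)  # slope 0.1 is an assumption

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))
```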
Finally, the detection component is a standard scheme inherited from the YOLO family; it creates multi-scale grids for detecting objects of different sizes (e.g., a grid size of 8 × 8 for detecting small objects, 16 × 16 for medium objects, and 32 × 32 for big objects). We then use 1 × 1 convolutions to combine the feature maps and produce predictions for nine anchor boxes, whose sizes are obtained by K-means clustering. In the final stage, the Non-max Suppression (NMS) operation is applied to select only one bounding box out of many overlapping ones as the final detection.
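For readers unfamiliar with NMS, the snippet below shows the suppression step using torchvision's built-in nms; the 0.45 IoU threshold is an assumed (typical) value, not one reported in the paper.

```python
import torch
from torchvision.ops import nms

# boxes: (N, 4) in (x1, y1, x2, y2) format; scores: (N,) confidence values.
boxes = torch.tensor([[10., 10., 60., 60.],
                      [12., 12., 62., 62.],    # heavily overlaps the first box
                      [100., 100., 150., 150.]])
scores = torch.tensor([0.9, 0.6, 0.8])

keep = nms(boxes, scores, iou_threshold=0.45)  # 0.45 is an assumed threshold
print(keep)  # tensor([0, 2]) -- the lower-scoring duplicate is suppressed
```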

2.2.1. Sand Clock Feature Extraction Module (SCFEM)

The Sand Clock Feature Extraction Module (SCFEM) is designed to extract high-quality mid-level features from the input image. The detailed architecture of SCFEM is given in Figure 4a. The backbone component of YOLO-JD contains two SCFEM modules, and we insert four SCFEM modules in the neck component. The SCFEM contains an important feature extraction block called the Sand Clock Operation (SCO), which consists of five steps. The first step is a 1 × 1 conv followed by a BL operation (batch normalization + Leaky ReLU activation); the second step uses two spatially separable convolutions (3 × 1 conv + 1 × 3 conv) followed by BL to abstract features; the third step is a 1 × 1 conv followed by a BL operation, which creates a “thin” feature map; the fourth step is similar to the second; and the last step is a 1 × 1 conv followed by an activation function such as ReLU. As features pass through these five steps, the feature maps first gradually narrow and then widen again, taking the shape of a sand clock; hence the name SCO. The two spatially separable convolutions (3 × 1 conv + 1 × 3 conv) replace a traditional 3 × 3 conv because they require fewer parameters. In SCFEM, we also use multiple 1 × 1 convs and 3 × 3 convs, as well as a skip connection, to enhance its feature extraction ability. In the neck component of YOLO-JD, three SCFEMs are applied to generate three high-level feature layers at different scales to serve the subsequent detection.
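The following PyTorch sketch illustrates one plausible implementation of the five-step SCO; the channel widths (halving toward the “thin” middle map) are our assumptions, since the paper specifies the steps but not the exact sizes.

```python
import torch.nn as nn

def conv_bl(in_ch, out_ch, kernel, pad):
    """A convolution followed by the BL operation (batch norm + Leaky ReLU)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel, padding=pad, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.LeakyReLU(0.1, inplace=True),
    )

class SandClockOperation(nn.Module):
    """Five-step SCO block; channels narrow toward a 'thin' middle map
    and widen again, mirroring the sand-clock shape described above."""
    def __init__(self, ch: int):
        super().__init__()
        mid, thin = ch // 2, ch // 4                    # assumed squeeze ratios
        self.step1 = conv_bl(ch, mid, 1, 0)             # 1x1 conv + BL
        self.step2 = nn.Sequential(                     # 3x1 then 1x3 conv + BL
            conv_bl(mid, mid, (3, 1), (1, 0)),
            conv_bl(mid, mid, (1, 3), (0, 1)),
        )
        self.step3 = conv_bl(mid, thin, 1, 0)           # 1x1 conv + BL ("thin" map)
        self.step4 = nn.Sequential(                     # 3x1 then 1x3 conv + BL
            conv_bl(thin, thin, (3, 1), (1, 0)),
            conv_bl(thin, thin, (1, 3), (0, 1)),
        )
        self.step5 = nn.Sequential(                     # 1x1 conv + ReLU
            nn.Conv2d(thin, ch, 1, bias=False),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.step5(self.step4(self.step3(self.step2(self.step1(x)))))
```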

2.2.2. Deep Sand Clock Feature Extraction Module (DSCFEM)

The biggest difference between the Deep Sand Clock Feature Extraction Module (DSCFEM) and SCFEM is that the DSCFEM contains three consecutive SCO blocks (Figure 4b). This makes DSCFEM much deeper than SCFEM and explains why its name begins with “deep”. The second difference is that we only use 1 × 1 convs in the parts outside the SCOs. As the DSCFEM has a deeper design than SCFEM, the wide usage of 1 × 1 convs rather than 3 × 3 convs reduces the network parameters and shortens the training and inference time. DSCFEM is applied twice in the backbone component of YOLO-JD and is responsible for the efficient abstraction of low-level image features.
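Reusing the SandClockOperation sketch above, a DSCFEM could then be assembled as follows; again, the channel choices are illustrative assumptions.

```python
import torch.nn as nn

class DSCFEM(nn.Module):
    """Three consecutive SCO blocks wrapped by 1x1 convolutions
    (no 3x3 convs outside the SCOs), as described above."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.pre = nn.Conv2d(in_ch, out_ch, 1, bias=False)    # 1x1 conv only
        self.scos = nn.Sequential(*[SandClockOperation(out_ch) for _ in range(3)])
        self.post = nn.Conv2d(out_ch, out_ch, 1, bias=False)  # 1x1 conv only

    def forward(self, x):
        return self.post(self.scos(self.pre(x)))
```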

2.2.3. Spatial Pyramid Pooling Module (SPPM)

The Spatial Pyramid Pooling Module (SPPM) is applied only once, in the backbone component, and its output becomes the input of the SCFEM with the highest resolution. The SPPM (shown in Figure 4c) first creates a feature pyramid and then uses convolutions to integrate the features at different scales. Through SPPM, the network learns object features with different receptive fields, and the feature maps with multiple receptive fields are naturally fused to create an effective feature embedding with both local and global focus. In implementation, we use three different convolution kernels to generate multi-layer pyramid maps, which are then separately max-pooled and concatenated to generate the output. Thanks to its multi-scale feature extraction, SPPM may enhance the recognition of objects of varied sizes.
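A rough sketch of such a module is given below; the branch kernel sizes (3, 5, 7) and the stride-1 pooling window are assumptions chosen only to illustrate the convolve, max-pool, and concatenate structure described above.

```python
import torch
import torch.nn as nn

class SPPM(nn.Module):
    """Three convolution branches with different kernel sizes, each
    max-pooled, then concatenated and fused back to the input width."""
    def __init__(self, ch: int):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(ch, ch, k, padding=k // 2, bias=False) for k in (3, 5, 7)
        ])
        self.pool = nn.MaxPool2d(kernel_size=3, stride=1, padding=1)  # keeps H x W
        self.fuse = nn.Conv2d(3 * ch, ch, 1, bias=False)              # merge pyramid

    def forward(self, x):
        pyramid = [self.pool(branch(x)) for branch in self.branches]
        return self.fuse(torch.cat(pyramid, dim=1))
```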

2.3. Loss Functions

A comprehensive loss function is designed for training YOLO-JD, and this loss function contains several different sub-losses.
The first sub-loss is the Intersection over Union (IoU) loss, which descends from a basic criterion used frequently in target detection and tracking. IoU is defined as the ratio of the intersection of the prediction box $B^p$ and its Ground Truth (GT) box $B^{gt}$ to their union, and the IoU loss is given as:

$$L_{IoU}(B^p, B^{gt}) = 1 - \frac{|B^p \cap B^{gt}|}{|B^p \cup B^{gt}|}.$$
In most cases, IoU can reasonably evaluate the detection performance. However, when there is no intersection between a prediction and its GT, the IoU loss reaches its maximum value of 1.0. It then becomes impossible to distinguish the relative distance between the GT area and the predicted area, since a very bad prediction (very far away) and a not-so-bad prediction (near but still no intersection) are punished with the same $L_{IoU}$. To improve the training, we resort to the CIoU (Complete IoU) loss [27], a generalized IoU sub-loss that takes three geometric factors into account, i.e., the overlapping factor (the standard IoU loss), the distance factor $D$, and the aspect ratio factor $V$. The $L_{CIoU}$ can then be defined as follows,

$$L_{CIoU} = L_{IoU}(B^p, B^{gt}) + D(B^p, B^{gt}) + V(B^p, B^{gt}),$$
where $D$ and $V$ denote the distance factor and the aspect ratio factor, respectively. The distance factor $D(B^p, B^{gt})$ is likewise a function of both the prediction and the GT box. It can be written as

$$D(B^p, B^{gt}) = \frac{(b^p - b^{gt})^2}{c^2},$$

in which $b^p$ and $b^{gt}$ are the central point coordinates of the boxes $B^p$ and $B^{gt}$, respectively, and $c$ is the diagonal length of the smallest bounding box that circumscribes both $B^p$ and $B^{gt}$. The distance factor can thus be regarded as a normalized distance between the two boxes.
The aspect ratio factor can be calculated via the following equation,

$$V = \frac{4}{\pi^2}\left(\arctan\frac{\omega^{gt}}{h^{gt}} - \arctan\frac{\omega^p}{h^p}\right)^2,$$

where $\omega$ and $h$ represent the width and height of a bounding box, respectively. The lower $L_{CIoU}$ is, the better the predicted box approximates the ground truth.
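The CIoU loss above can be transcribed almost directly into code. The sketch below follows the equations as given (with no extra trade-off coefficient on $V$, which some CIoU implementations add); boxes are assumed to be in (x1, y1, x2, y2) format.

```python
import math
import torch

def ciou_loss(bp: torch.Tensor, bgt: torch.Tensor) -> torch.Tensor:
    """CIoU loss for prediction boxes bp and GT boxes bgt, each of shape (N, 4)."""
    # Overlap factor: 1 - IoU.
    ix1 = torch.max(bp[:, 0], bgt[:, 0]); iy1 = torch.max(bp[:, 1], bgt[:, 1])
    ix2 = torch.min(bp[:, 2], bgt[:, 2]); iy2 = torch.min(bp[:, 3], bgt[:, 3])
    inter = (ix2 - ix1).clamp(0) * (iy2 - iy1).clamp(0)
    area_p = (bp[:, 2] - bp[:, 0]) * (bp[:, 3] - bp[:, 1])
    area_g = (bgt[:, 2] - bgt[:, 0]) * (bgt[:, 3] - bgt[:, 1])
    iou = inter / (area_p + area_g - inter + 1e-9)

    # Distance factor D: squared center distance over the squared diagonal
    # of the smallest box enclosing both boxes.
    cpx, cpy = (bp[:, 0] + bp[:, 2]) / 2, (bp[:, 1] + bp[:, 3]) / 2
    cgx, cgy = (bgt[:, 0] + bgt[:, 2]) / 2, (bgt[:, 1] + bgt[:, 3]) / 2
    ex1 = torch.min(bp[:, 0], bgt[:, 0]); ey1 = torch.min(bp[:, 1], bgt[:, 1])
    ex2 = torch.max(bp[:, 2], bgt[:, 2]); ey2 = torch.max(bp[:, 3], bgt[:, 3])
    c2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2 + 1e-9
    d = ((cpx - cgx) ** 2 + (cpy - cgy) ** 2) / c2

    # Aspect ratio factor V.
    wp, hp = bp[:, 2] - bp[:, 0], bp[:, 3] - bp[:, 1]
    wg, hg = bgt[:, 2] - bgt[:, 0], bgt[:, 3] - bgt[:, 1]
    v = (4 / math.pi ** 2) * (torch.atan(wg / (hg + 1e-9))
                              - torch.atan(wp / (hp + 1e-9))) ** 2

    return (1 - iou) + d + v
```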

3. Experiments

3.1. Evaluation Metrics

For our jute disease dataset, each detected bounding box can be categorized into three cases. A true positive (TP) is a detected box whose IoU value (defined as $|B^p \cap B^{gt}| / |B^p \cup B^{gt}|$) against its ground truth box is higher than 50%. A false positive (FP) is a detected box whose IoU value is lower than 50%. A false negative (FN) is a ground truth box that is not covered by any detection. Based on TP, FP, and FN, we define Precision (Prec), Recall (Rec), and F1-measure (F1). Precision reflects the correctness of a model over all detected boxes. It is defined as the ratio of the number of TPs to the number of all detected bounding boxes:

$$Precision = \frac{TP}{TP + FP}.$$
Recall reflects the ability of a model to cover all ground truth bounding boxes. It is defined as the ratio of the number of TPs to the number of all ground truth bounding boxes:

$$Recall = \frac{TP}{TP + FN}.$$
F1-measure is a combination of Precision and Recall, and it is defined as follows,

$$F1 = \frac{2 \times Precision \times Recall}{Precision + Recall}.$$
The mean average precision (mAP) is defined as:

$$mAP = \frac{1}{C}\sum_{i=1}^{C} Precision(i),$$

where $C$ is the total number of disease/pest classes and $Precision(i)$, computed as in the Precision equation above, stands for the precision of each class.
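A minimal sketch of these metric definitions, useful for sanity-checking reported numbers:

```python
def detection_metrics(tp: int, fp: int, fn: int):
    """Precision, Recall, and F1 from TP/FP/FN counts at the 50% IoU
    threshold described above."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Example: 95 true positives, 4 false positives, 6 missed ground-truth boxes.
print(detection_metrics(95, 4, 6))  # -> (0.9596, 0.9406, 0.9500), approximately
```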

3.2. Training Details

We implemented YOLO-JD in PyTorch and trained it on a single GPU (NVIDIA RTX 2080). The model ran on a computer with an AMD 3700X CPU under the Ubuntu 18.04 operating system. We used a learning rate of 0.02 for the first 100 epochs and 0.01 for the last 50 epochs, with a mini-batch size of 8. We trained our network using the Adam optimizer (momentum 0.937, weight decay 0.005, 3.0 warm-up epochs, and a warm-up initial momentum of 0.8).
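For reference, the hyperparameters above can be summarized in a single configuration dictionary; the key names follow common YOLOv5-style conventions and are our assumption about the implementation.

```python
# Training hyperparameters transcribed from the text above.
HYP = {
    "lr_initial": 0.02,       # epochs 1-100
    "lr_final": 0.01,         # epochs 101-150
    "batch_size": 8,
    "momentum": 0.937,
    "weight_decay": 0.005,
    "warmup_epochs": 3.0,
    "warmup_momentum": 0.8,
    "img_size": 640,
}
```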

3.3. Quantitative Results

To prove the effectiveness of YOLO-JD, we compare it only with detection models from the YOLO family, because the YOLO family currently holds the best image object detection performance in various applications. The contrasted models include YOLOv3 [12], YOLOv3 (tiny) [12], YOLOv4 [13], YOLOv4 (tiny) [13], YOLOv5-s [28], YOLOv5-m [28], YOLOv5-l [28], and YOLOv5-x [28]. For all methods except YOLO-JD, we obtained models pre-trained on the COCO dataset [29] and then conducted transfer learning on the jute dataset to speed up convergence. In contrast, our YOLO-JD was trained directly on the jute dataset. Table 2 reports the quantitative results on our jute disease test images. YOLO-JD achieved the best performance on all four metrics: Prec, Rec, F1, and mAP.

3.4. Qualitative Results

The qualitative comparison between YOLO-JD and the other models is given in Figure 5. We chose one example from each disease category of the testing dataset. The first column of Figure 5 shows the input test images; the second column gives the ground truth; the third column gives the results of our YOLO-JD; and the fourth to eleventh columns show the eight other state-of-the-art methods, respectively. Each row of Figure 5 stands for a type of disease, and the rows are arranged in the same order as in Table 1. For example, the first row shows the detection results of all models on stem rot, for which YOLO-JD is the most similar to the ground truth bounding box. The second row shows the detection results on anthracnose; YOLOv3, YOLOv3 (tiny), YOLOv5-m, and YOLOv5-l all produce an extra false detection box, while our YOLO-JD detects the diseased area with high accuracy. The third row shows the results of detecting black band; our YOLO-JD avoids false positives and is almost identical to the ground truth. On the fourth to eighth rows, our method is still the closest to the ground truth across all compared models. The ninth and tenth rows show two test images of jute pests; YOLO-JD successfully detected the accurate pest areas, and our results are again the closest to the ground truth.
YOLO-JD has the ability to locate multiple instances of a disease in the same image and is also able to detect multiple classes of diseases and pests in the same image. Figure 6 shows that YOLO-JD successfully detects multiple disease cases in three images from the testing dataset: Figure 6a contains 1 case of D-2 and 3 cases of P-1; Figure 6b contains 1 case of D-2 and 1 case of P-1; and Figure 6c contains 1 case of D-6 and 1 case of P-1.

3.5. Ablation Analysis

To quantify the independent contribution of each new module to the overall performance of YOLO-JD, we performed a simple but effective ablation analysis on the jute disease dataset. The results of all ablation cases are shown in Table 3. In the “A1” version of our model, we replaced the SPPM with the original Spatial Pyramid Pooling structure of the standard YOLOv5 while keeping all other parts unchanged. In the “A2” version, we replaced the DSCFEM with the “C3” module of the original YOLOv5. In the “A3” version, we replaced the SCFEM with the “BottleneckCSP” module of the original YOLOv5. The “C” version in Table 3 denotes the complete YOLO-JD model. We compared the complete model with the “A1”, “A2”, and “A3” versions using Precision, Recall, and F1 under the same training scheme and the same dataset. The fully deployed YOLO-JD has the best performance in the ablation comparison.

4. Discussion

In this discussion, we highlight studies that used deep learning models to detect or recognize different crop diseases with high accuracy. Lee et al. [30] used a Convolutional Neural Network (CNN) that removes the background and keeps only the potato leaves in the image to judge disease symptoms on potato plants, achieving an accuracy of around 99%. Islam et al. [31] used transfer learning with the VGG16 network to detect potato diseases from images with an accuracy of 99.43%. Likewise, Olivares et al. [32,33,34] used machine learning algorithms such as Random Forest to accurately identify soil properties associated with symptoms of tropical banana diseases. In our work, YOLO-JD achieved an average mAP of 96.63% over multiple diseases and pests of jute. Together with YOLO-JD, the above works disclose the popularity and broad application prospects of machine learning and deep learning for disease prevention in precision agriculture. It is therefore possible to effectively use CNN-like or YOLO-like architectures to detect crop diseases from images with highly accurate results. Though the size of the architectures and the number of parameters may vary from task to task, the creation of suitable models for various types of human-machine interfaces is possible and even straightforward.

5. Conclusions

In this paper, we presented a new model, YOLO-JD, for detecting jute diseases and pests. The main contributions of this paper are threefold: (i) we built a new image dataset with accurate manual labels for jute diseases and pests, containing ten classes (eight diseases and two pests); (ii) we integrated three new modules into the YOLO-JD architecture and achieved an average mAP of 96.63% and an F1-score of 95.83% over all classes; and (iii) YOLO-JD outcompeted several other state-of-the-art methods from the YOLO family both qualitatively and quantitatively.
In the future, we will continue to optimize YOLO-JD for better performance. We are also going to update the jute disease dataset, and try to accommodate YOLO-JD to light-weight applications (such as apps for mobile devices).

Author Contributions

Conceptualization, D.L. and F.A.; methodology, D.L. and F.A.; software, F.A.; validation, N.W.; investigation, A.I.S.; data acquisition, F.A. and A.I.S.; writing—original draft preparation, F.A.; writing—review and editing, D.L., N.W. and F.A.; visualization, F.A.; supervision, D.L.; funding acquisition, D.L. and N.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the Shanghai Rising-Star Program (No. 21QA1400100), Shanghai Natural Science Foundation (No. 20ZR1400800), and in part by the National Natural Science Foundation of China (No. 52101346).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data and code are available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Mahapatra, B.S.; Mitra, S.; Ramasubramanian, T.; Sinha, M.K. Research on jute (Corchorus olitorius and C. capsularis) and kenaf (Hibiscus cannabinus and H. sabdariffa): Present status and future perspective. Indian J. Agric. Sci. 2009, 79, 951–967. [Google Scholar]
  2. Miah, M.J.; Khan, M.A.; Khan, R.A. Fabrication and Characterization of Jute Fiber Reinforced Low Density Polyethylene Based Composites: Effects of Chemical Treatment. J. Sci. Res. 2011, 3, 249–259. [Google Scholar] [CrossRef]
  3. CIEL; EIP; FracTracker Alliance; GAIA; 5Gyres; Breakfreefromplastic. Plastic & Climate: The Hidden Costs of a Plastic Planet; CIEL: Washington, DC, USA, 2019; pp. 1–108. Available online: https://www.ciel.org/wp-content/uploads/2019/05/Plastic-and-Climate-FINAL-2019.pdf (accessed on 26 March 2022).
  4. Barkoula, N.M.; Alcock, B.; Cabrera, N.O.; Peijs, T. Flame-Retardancy Properties of Intumescent Ammonium Poly(Phosphate) and Mineral Filler Magnesium Hydroxide in Combination with Graphene. Polym. Polym. Compos. 2008, 16, 101–113. [Google Scholar]
  5. Balasundram, S.K.; Golhani, K.; Shamshiri, R.R.; Vadamalai, G. Precision agriculture technologies for management of plant diseases. In Plant Disease Management Strategies for Sustainable Agriculture through Traditional and Modern Approaches; Springer: Cham, Switzerland, 2020; pp. 259–278. [Google Scholar]
  6. Traversari, S.; Cacini, S.; Galieni, A.; Nesi, B.; Nicastro, N.; Pane, C. Precision agriculture digital technologies for sustainable fungal disease management of ornamental plants. Sustainability 2021, 13, 3707. [Google Scholar] [CrossRef]
  7. Maniyath, S.R.; Vinod, P.V.; Niveditha, M.; Pooja, R.; Prasad Bhat, N.; Shashank, N.; Hebbar, R. Plant disease detection using machine learning. In Proceedings of the 2018 International Conference on Design Innovations for 3Cs Compute Communicate Control, ICDI3C 2018, Bangalore, India, 25–26 April 2018; pp. 41–45. [Google Scholar] [CrossRef]
  8. Gavhale, K.R.; Gawande, U.; Hajari, K.O. Unhealthy region of citrus leaf detection using image processing techniques. In Proceedings of the International Conference for Convergence for Technology—2014, Pune, India, 6–8 April 2014; pp. 2–7. [Google Scholar] [CrossRef]
  9. Hossain, M.S.; Mou, R.M.; Hasan, M.M.; Chakraborty, S.; Abdur Razzak, M. Recognition and detection of tea leaf’s diseases using support vector machine. In Proceedings of the 2018 IEEE 14th International Colloquium on Signal Processing & Its Applications (CSPA), Penang, Malaysia, 9–10 March 2018; pp. 150–154. [Google Scholar] [CrossRef]
  10. Jiao, L.; Zhang, F.; Liu, F.; Yang, S.; Li, L.; Feng, Z.; Qu, R. A survey of deep learning-based object detection. IEEE Access 2019, 7, 128837–128868. [Google Scholar] [CrossRef]
  11. Saleem, M.H.; Potgieter, J.; Arif, K.M. Plant disease classification: A comparative evaluation of convolutional neural networks and deep learning optimizers. Plants 2020, 9, 1319. [Google Scholar] [CrossRef] [PubMed]
  12. Redmon, J.; Farhadi, A. YOLOv3: An Incremental Improvement. 2018. Available online: http://arxiv.org/abs/1804.02767 (accessed on 8 April 2018).
  13. Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. 2020. Available online: http://arxiv.org/abs/2004.10934 (accessed on 23 April 2020).
  14. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 386–397. [Google Scholar] [CrossRef] [PubMed]
  16. Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. SSD: Single Shot Multibox Detector; Computer Vision-ECCV 2016. Lecture Notes in Computer Science (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics); Springer: Cham, Switzerland, 2016; Volume 9905, pp. 21–37. Available online: https://0-doi-org.brum.beds.ac.uk/10.1007/978-3-319-46448-0_2 (accessed on 29 December 2016). [CrossRef] [Green Version]
  17. Hammad Saleem, M.; Khanchi, S.; Potgieter, J.; Mahmood Arif, K. Image-based plant disease identification by deep learning meta-architectures. Plants 2020, 9, 1451. [Google Scholar] [CrossRef] [PubMed]
  18. Chowdhury, M.E.H.; Rahman, T.; Khandakar, A.; Ayari, M.A.; Khan, A.U.; Khan, M.S.; Al-Emadi, N.; Reaz, M.B.I.; Islam, M.T.; Ali, S.H.M. Automatic and Reliable Leaf Disease Detection Using Deep Learning Techniques. AgriEngineering 2021, 3, 294–312. [Google Scholar] [CrossRef]
  19. Mohanty, S.P.; Hughes, D.P.; Salathé, M. Using deep learning for image-based plant disease detection. Front. Plant Sci. 2016, 7, 1419. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  20. Görlich, F.; Marks, E.; Mahlein, A.K.; König, K.; Lottes, P.; Stachniss, C. Uav-based classification of cercospora leaf spot using rgb images. Drones 2021, 5, 34. [Google Scholar] [CrossRef]
  21. Chen, Z.; Wu, R.; Lin, Y.; Li, C.; Chen, S.; Yuan, Z.; Chen, S.; Zou, X. Plant Disease Recognition Model Based on Improved YOLOv5. Agronomy 2022, 12, 365. [Google Scholar] [CrossRef]
  22. Arsenovic, M.; Karanovic, M.; Sladojevic, S.; Anderla, A.; Stefanovic, D. Solving current limitations of deep learning based approaches for plant disease detection. Symmetry 2019, 11, 939. [Google Scholar] [CrossRef] [Green Version]
  23. Vishnoi, V.K.; Kumar, K.; Kumar, B. Plant Disease Detection Using Computational Intelligence and Image Processing; Springer: Berlin/Heidelberg, Germany, 2021; Volume 128. [Google Scholar]
  24. Wagle, S.A.; Harikrishnan, R.; Ali, S.H.M.; Faseehuddin, M. Classification of plant leaves using new compact convolutional neural network models. Plants 2022, 11, 24. [Google Scholar] [CrossRef] [PubMed]
  25. De, R.K. Jute Diseases: Diagnosis and Management; ICAR-Central Research Institute for Jute and Allied Fibres (Indian Council of Agricultural Research): Kolkata, India, 2019; ISBN 9789353822149. Available online: http://www.crijaf.org.in/ (accessed on 1 April 2019).
  26. Tzutalin. labelImg. 2015. Available online: https://github.com/tzutalin/labelImg (accessed on 27 July 2015).
  27. Zheng, Z.; Wang, P.; Ren, D.; Liu, W.; Ye, R.; Hu, Q.; Zuo, W. Enhancing Geometric Factors in Model Learning and Inference for Object Detection and Instance Segmentation. IEEE Trans. Cybern. 2021, 20, 1–13. [Google Scholar] [CrossRef] [PubMed]
  28. Ultralytics. YOLOv5. 2020. Available online: https://github.com/ultralytics/yolov5 (accessed on 25 June 2020).
  29. Lin, T.Y.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, C.L. Microsoft COCO: Common Objects in Context; Computer Vision-ECCV 2014. Lecture Notes in Computer Science (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics); Springer: Cham, Switzerland, 2014; Volume 8693, pp. 740–755. [Google Scholar] [CrossRef] [Green Version]
  30. Lee, T.Y.; Yu, J.Y.; Chang, Y.C.; Yang, J.M. Health Detection for Potato Leaf with Convolutional Neural Network. In Proceedings of the 2020 Indo—Taiwan 2nd International Conference on Computing, Analytics and Networks (Indo-Taiwan ICAN), Rajpura, India, 7–15 February 2020; pp. 289–293. [Google Scholar] [CrossRef]
  31. Islam, F.; Hoq, M.N.; Rahman, C.M. Application of transfer learning to detect potato disease from leaf image. In Proceedings of the IEEE International Conference on Robotics, Automation, Artificial-Intelligence and Internet-of-Things (RAAICON), Dhaka, Bangladesh, 29 November–1 December 2019; pp. 127–130. [Google Scholar] [CrossRef]
  32. Olivares, B.O.; Rey, J.C.; Lobo, D.; Navas-Cortés, J.A.; Gómez, J.A.; Landa, B.B. Fusarium wilt of bananas: A review of agro-environmental factors in the venezuelan production system affecting its development. Agronomy 2021, 11, 986. [Google Scholar] [CrossRef]
  33. Olivares, B.O.; Paredes, F.; Rey, J.C.; Lobo, D.; Galvis-Causil, S. The relationship between the normalized difference vegetation index, rainfall, and potential evapotranspiration in a banana plantation of Venezuela. Soc. Psychol. Soc. 2021, 12, 58–64. [Google Scholar] [CrossRef]
  34. Olivares, B. Determination of the Potential Influence of Soil in the Differentiation of Productivity and in the Classification of Susceptible Areas to Banana wilt in Venezuela; UCOPress: Córdoba, Spain, 2022; pp. 89–111. [Google Scholar]
Figure 1. Demonstration of the full process of jute manufacturing. The production process comprises at least six steps, starting from harvesting to the binding of jute fiber.
Figure 2. Some sample images from our jute diseases and pests dataset. (a) the stem rot disease. (b) the anthracnose disease. (c) the black band disease. (d) the soft rot disease. (e) the tip blight disease. (f) the dieback disease. (g) the jute mosaic disease. (h) the jute chlorosis disease. (i) the Cosmophila sabulifera caterpillar on a jute leaf. (j) some Hairy Caterpillars on a jute leaf.
Figure 3. The overall architecture of YOLO-JD. The architecture contains three components—the head (backbone) component, the neck component, and the detection component. Structures of the three new modules: SPPM, SCFEM, and DSCFEM are detailed in Figure 4.
Figure 4. The detailed demonstration of several key modules in YOLO-JD. (a) The architecture of the Sand Clock Feature Extraction Module (SCFEM). (b) The details of the Deep Sand Clock Feature Extraction Module (DSCFEM). (c) The Spatial Pyramid Pooling Module (SPPM).
Figure 5. Qualitative comparison between our YOLO-JD and eight other models on jute disease detection.
Figure 6. YOLO-JD detection on images that contain multiple instances of the same disease and multiple classes of diseases and pests in the same image. (a,b) are two jute stem images that both contain the anthracnose disease and the Cosmophila sabulifera pest at the same time; (c) is an image containing the dieback disease and the Cosmophila sabulifera pest.
Table 1. The indices and causes of jute diseases and pests.

| Index | Name of Disease/Pest | Cause | Causal Organism |
|---|---|---|---|
| D-1 | Stem rot | Fungal | Macrophomina phaseolina (Tassi) Goid. |
| D-2 | Anthracnose | Fungal | Colletotrichum corchorum Ikata and Tanaka; C. gloeosporioides (Penz.) Penz and Sacc. |
| D-3 | Black band | Fungal | Botryodiplodia theobromae (Pat.) Griff and Maubl. |
| D-4 | Soft rot | Fungal | Sclerotium rolfsii Sacc. (Athelia rolfsii) |
| D-5 | Tip blight | Fungal | Curvularia subulata (Nees ex Fr.) Boedijn |
| D-6 | Die back | Fungal | Diplodia corchori Syd. and P. Syd. |
| D-7 | Jute mosaic | Viral | A Begomovirus of the Geminiviridae family; vector: Bemisia tabaci Genn. (whitefly) |
| D-8 | Jute chlorosis | Viral | A member of the Tobravirus genus |
| P-1 | Cosmophila sabulifera | Pest | — |
| P-2 | Hairy Caterpillar | Pest | — |
Table 2. The quantitative comparison of several methods including YOLO-JD on the jute disease test dataset. The best measures are in boldface.

Prec (%):

| Method | D-1 | D-2 | D-3 | D-4 | D-5 | D-6 | D-7 | D-8 | P-1 | P-2 | Mean |
|---|---|---|---|---|---|---|---|---|---|---|---|
| YOLOv3 | 73.43 | 78.71 | 78.51 | 73.81 | 83.22 | 97.81 | 98.32 | 94.40 | 80.13 | 74.23 | 83.26 |
| YOLOv4 | 83.33 | 81.94 | 80.03 | 85.71 | 84.64 | 97.77 | 97.31 | 91.07 | 82.89 | 76.10 | 86.08 |
| YOLOv3 Tiny | 69.23 | 71.13 | 68.01 | 75.60 | 88.81 | 94.72 | 98.03 | 96.29 | 79.01 | 71.94 | 81.27 |
| YOLOv4 Tiny | 67.74 | 82.85 | 79.16 | 81.25 | 82.98 | 97.43 | 44.54 | 92.59 | 82.89 | 77.77 | 78.92 |
| YOLOv5s | 83.82 | 82.91 | 71.22 | 75.92 | 89.41 | **98.91** | 96.21 | 97.31 | 84.40 | 82.12 | 86.22 |
| YOLOv5m | 69.13 | 65.34 | 69.54 | 74.52 | 85.55 | 97.84 | 96.44 | **98.80** | 80.33 | 70.33 | 80.78 |
| YOLOv5l | 69.51 | 71.75 | 73.10 | 81.52 | 88.62 | 98.34 | 96.15 | 96.22 | 85.75 | 78.41 | 83.93 |
| YOLOv5x | 68.24 | 66.62 | 72.71 | 78.21 | 89.70 | 98.82 | 96.71 | 98.15 | 83.31 | 78.15 | 83.06 |
| YOLO-JD (ours) | **98.34** | **96.21** | **95.82** | **97.10** | **95.31** | 98.10 | **98.65** | 97.90 | **91.60** | **92.90** | **96.19** |

Rec (%):

| Method | D-1 | D-2 | D-3 | D-4 | D-5 | D-6 | D-7 | D-8 | P-1 | P-2 | Mean |
|---|---|---|---|---|---|---|---|---|---|---|---|
| YOLOv3 | 81.73 | 83.98 | 71.74 | 85.46 | 97.53 | 92.72 | 96.11 | 92.74 | 83.76 | 86.73 | 87.25 |
| YOLOv4 | 85.64 | 90.17 | 68.93 | **94.87** | 87.64 | 91.76 | 94.15 | 90.92 | 75.36 | 76.98 | 85.64 |
| YOLOv3 Tiny | 78.64 | 87.53 | 73.51 | 91.24 | 93.74 | 93.83 | 95.81 | 89.83 | 86.36 | 89.36 | 87.98 |
| YOLOv4 Tiny | 74.63 | 89.74 | 77.93 | 83.76 | 78.54 | 94.62 | **97.33** | 81.02 | 88.36 | 73.72 | 83.96 |
| YOLOv5s | 78.83 | 84.61 | 59.70 | 82.91 | 97.33 | 95.90 | 95.71 | 91.22 | 92.51 | 82.91 | 86.16 |
| YOLOv5m | 84.81 | 89.43 | 74.11 | 89.72 | **98.75** | 91.33 | 92.24 | 95.35 | 96.23 | 89.30 | 90.12 |
| YOLOv5l | 78.84 | 85.92 | 75.93 | 91.33 | 97.95 | 91.85 | 93.91 | 89.20 | 93.40 | 83.65 | 88.19 |
| YOLOv5x | 78.12 | 88.55 | 73.91 | 92.94 | 97.71 | 91.89 | 91.73 | 89.41 | 93.61 | 84.21 | 88.20 |
| YOLO-JD (ours) | **92.41** | **98.62** | **86.92** | 93.22 | 98.10 | **96.75** | 96.71 | **96.21** | **97.31** | **95.20** | **95.14** |

F1 (%):

| Method | D-1 | D-2 | D-3 | D-4 | D-5 | D-6 | D-7 | D-8 | P-1 | P-2 | Mean |
|---|---|---|---|---|---|---|---|---|---|---|---|
| YOLOv3 | 77.11 | 81.25 | 74.96 | 79.20 | 89.79 | 95.19 | 96.98 | 93.56 | 81.88 | 79.97 | 84.99 |
| YOLOv4 | 84.46 | 85.85 | 74.05 | 82.68 | 86.11 | 94.66 | 95.54 | 90.99 | 78.94 | 76.53 | 84.98 |
| YOLOv3 Tiny | 73.63 | 78.48 | 70.64 | 90.05 | 91.20 | 93.91 | **97.47** | 92.94 | 82.52 | 79.70 | 85.05 |
| YOLOv4 Tiny | 71.01 | 86.15 | 78.54 | 82.48 | 80.69 | 96.00 | 60.53 | 86.41 | 85.53 | 75.69 | 80.30 |
| YOLOv5s | 81.22 | 83.74 | 64.94 | 79.24 | 93.18 | 97.85 | 95.59 | 94.15 | 88.26 | 82.49 | 86.06 |
| YOLOv5m | 76.14 | 75.47 | 71.72 | 81.10 | 91.75 | 94.43 | 94.05 | 97.60 | 87.53 | 78.66 | 85.84 |
| YOLOv5l | 73.85 | 77.74 | 74.47 | 85.98 | 93.01 | 95.40 | 94.05 | 92.56 | 89.00 | 80.91 | 85.69 |
| YOLOv5x | 72.81 | 76.00 | 73.29 | 84.54 | 93.14 | 95.41 | 94.57 | 93.54 | 88.15 | 81.03 | 85.24 |
| YOLO-JD (ours) | **95.25** | **97.38** | **92.55** | **94.04** | **96.11** | **97.94** | 96.83 | **98.01** | **95.29** | **94.03** | **95.74** |

mAP (%):

| Method | D-1 | D-2 | D-3 | D-4 | D-5 | D-6 | D-7 | D-8 | P-1 | P-2 | Mean |
|---|---|---|---|---|---|---|---|---|---|---|---|
| YOLOv3 | 89.23 | 88.42 | 85.43 | 96.13 | 96.41 | 97.72 | 98.53 | 95.72 | 90.10 | 84.75 | 92.24 |
| YOLOv4 | 86.10 | 91.44 | 83.33 | **97.55** | 96.55 | 96.91 | 97.51 | 98.33 | 94.74 | 89.25 | 93.17 |
| YOLOv3 Tiny | 89.42 | 88.51 | 85.52 | 96.52 | 96.72 | **98.52** | 98.72 | 95.91 | 85.92 | 85.91 | 92.16 |
| YOLOv4 Tiny | 78.53 | 84.71 | 82.51 | 91.71 | 90.33 | 97.88 | 76.22 | 94.90 | 92.77 | 80.33 | 86.98 |
| YOLOv5s | 79.71 | 87.34 | 63.57 | 96.31 | 97.85 | 96.40 | 98.51 | 93.25 | 93.35 | 81.50 | 88.78 |
| YOLOv5m | 79.83 | 87.55 | 68.60 | 95.72 | 98.40 | 96.21 | 98.73 | 97.41 | 92.55 | 83.14 | 89.82 |
| YOLOv5l | 79.94 | 86.73 | 75.61 | 96.54 | 98.31 | 96.33 | 98.75 | 93.92 | 93.41 | 84.35 | 90.39 |
| YOLOv5x | 80.22 | 88.61 | 72.23 | 96.91 | **98.80** | 94.35 | **98.90** | 95.95 | 94.40 | 84.31 | 90.47 |
| YOLO-JD (ours) | **97.21** | **96.10** | **89.40** | 94.23 | 98.61 | 98.50 | 98.10 | **98.70** | **96.90** | **97.30** | **96.63** |
Table 3. Results of the ablation study of YOLO-JD on the object detection task. The best measures are in boldface. The “√” sign means the module is deployed in the network.

Prec (%):

| Ver | SCFEM | DSCFEM | SPPM | D-1 | D-2 | D-3 | D-4 | D-5 | D-6 | D-7 | D-8 | P-1 | P-2 | Mean |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A1 | √ | √ | | **98.4** | 88.1 | 90.0 | 91.9 | 93.8 | 99.4 | 98.0 | 99.4 | 91.1 | 91.1 | 94.1 |
| A2 | √ | | √ | 91.6 | 85.3 | 87.0 | 92.4 | 92.1 | 99.1 | 96.2 | 99.1 | 89.7 | 89.7 | 92.2 |
| A3 | | √ | √ | 91.9 | 78.6 | 86.8 | 87.2 | 87.8 | 97.3 | 83.3 | 97.4 | 84.2 | 84.2 | 87.8 |
| C | √ | √ | √ | 98.3 | **96.2** | **98.0** | **97.1** | **95.3** | **99.5** | **98.3** | **99.7** | **91.6** | **91.6** | **96.5** |

Rec (%):

| Ver | SCFEM | DSCFEM | SPPM | D-1 | D-2 | D-3 | D-4 | D-5 | D-6 | D-7 | D-8 | P-1 | P-2 | Mean |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A1 | √ | √ | | **93.7** | 97.4 | 83.3 | 99.1 | 98.3 | 92.7 | 99.2 | 89.5 | 94.8 | 94.8 | 94.2 |
| A2 | √ | | √ | 89.4 | 97.1 | 79.6 | 98.7 | 99.3 | 91.7 | 99.1 | 89.0 | 93.4 | 93.4 | 93.1 |
| A3 | | √ | √ | 86.2 | 89.5 | 85.1 | 93.8 | 99.1 | 89.5 | 98.3 | 90.9 | 95.7 | 95.7 | 92.3 |
| C | √ | √ | √ | 92.4 | **98.6** | **86.9** | **99.3** | **99.5** | **98.1** | **99.5** | **96.2** | **99.3** | **99.3** | **96.9** |

F1 (%):

| Ver | SCFEM | DSCFEM | SPPM | D-1 | D-2 | D-3 | D-4 | D-5 | D-6 | D-7 | D-8 | P-1 | P-2 | Mean |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A1 | √ | √ | | **96.0** | 92.5 | 86.5 | 94.9 | 95.9 | 95.9 | 98.5 | 94.1 | 92.9 | 92.9 | 94.1 |
| A2 | √ | | √ | 90.5 | 90.8 | 83.1 | 95.4 | 95.5 | 95.2 | 97.6 | 93.7 | 91.5 | 91.5 | 92.4 |
| A3 | | √ | √ | 88.9 | 83.6 | 85.9 | 90.3 | 93.1 | 93.2 | 90.1 | 94.0 | 89.6 | 89.5 | 89.8 |
| C | √ | √ | √ | 95.2 | **97.4** | **92.1** | **98.1** | **97.3** | **98.7** | **98.8** | **97.9** | **95.3** | **95.2** | **96.6** |

mAP (%):

| Ver | SCFEM | DSCFEM | SPPM | D-1 | D-2 | D-3 | D-4 | D-5 | D-6 | D-7 | D-8 | P-1 | P-2 | Mean |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A1 | √ | √ | | 97.1 | 97.4 | 87.6 | 99.3 | 99.5 | 94.4 | 99.1 | 91.7 | 97.5 | 97.5 | 96.1 |
| A2 | √ | | √ | 93.6 | 97.8 | 87.0 | 99.1 | 99.3 | **98.7** | 98.6 | **98.7** | 97.5 | 97.1 | 96.7 |
| A3 | | √ | √ | 89.0 | 90.9 | 89.2 | 98.8 | 99.1 | 92.3 | 98.2 | 93.5 | 96.4 | 96.4 | 94.4 |
| C | √ | √ | √ | **97.2** | **98.1** | **89.4** | **99.7** | **99.6** | 98.5 | **99.3** | **98.7** | **99.4** | **99.4** | **97.9** |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

