IEEE Access (Jan 2024)

GrapeLeafNet: A Dual-Track Feature Fusion Network With Inception-ResNet and Shuffle-Transformer for Accurate Grape Leaf Disease Identification

  • R. Karthik,
  • R. Menaka,
  • S. Ompirakash,
  • P. Bala Murugan,
  • M. Meenakashi,
  • Sindhia Lingaswamy,
  • Daehan Won

DOI
https://doi.org/10.1109/ACCESS.2024.3361044
Journal volume & issue
Vol. 12
pp. 19612–19624

Abstract

Grapes are a widely cultivated horticultural crop, valued for their distinctive flavor and nutritional benefits. However, the crop is highly susceptible to diseases that can substantially reduce yield and quality, leading to considerable financial losses, so timely identification of these diseases is essential to manage their spread. Traditionally, identifying grape leaf diseases has relied on scientific expertise and observational skill; with the advent of deep learning, disease patterns can now be recognized directly from images of infected leaves. In this research, we propose GrapeLeafNet, a novel dual-track feature fusion network for grape leaf disease identification. The first track uses Inception-ResNet blocks to represent features at multiple scales, with a Convolutional Block Attention Module (CBAM) capturing significant channel and spatial dependencies; the second track employs a Shuffle-Transformer to extract long-range dependencies and complex global features. The features from the two tracks are then fused using coordinate attention, enabling the network to capture both local and global contextual information. Experimental results on the grape leaf disease subset of the PlantVillage dataset demonstrate the effectiveness of the proposed network, achieving an accuracy of 99.56%.
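
The abstract describes the architecture only at a high level. The following is a minimal PyTorch sketch of that dual-track idea, not the authors' implementation: the simplified stand-ins for the Inception-ResNet/CBAM track and the Shuffle-Transformer track, the channel widths, the patch size, and the four-class output head are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelSpatialAttention(nn.Module):
    # Simplified CBAM-style attention: a channel gate followed by a spatial gate.
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = x * self.channel_gate(x)
        pooled = torch.cat([x.mean(dim=1, keepdim=True),
                            x.amax(dim=1, keepdim=True)], dim=1)
        return x * self.spatial_gate(pooled)


class LocalTrack(nn.Module):
    # Stand-in for the Inception-ResNet + CBAM track: parallel multi-scale conv branches.
    def __init__(self, in_ch=3, out_ch=64):
        super().__init__()
        self.branch1 = nn.Conv2d(in_ch, out_ch // 2, 1)
        self.branch3 = nn.Conv2d(in_ch, out_ch // 2, 3, padding=1)
        self.attn = ChannelSpatialAttention(out_ch)

    def forward(self, x):
        return self.attn(torch.cat([self.branch1(x), self.branch3(x)], dim=1))


class GlobalTrack(nn.Module):
    # Stand-in for the Shuffle-Transformer track: patch tokens plus self-attention.
    def __init__(self, in_ch=3, dim=64, patch=16):
        super().__init__()
        self.embed = nn.Conv2d(in_ch, dim, kernel_size=patch, stride=patch)
        self.encoder = nn.TransformerEncoderLayer(d_model=dim, nhead=4,
                                                  batch_first=True)

    def forward(self, x):
        tokens = self.embed(x)                     # (B, dim, H/p, W/p)
        b, c, h, w = tokens.shape
        seq = self.encoder(tokens.flatten(2).transpose(1, 2))
        return seq.transpose(1, 2).reshape(b, c, h, w)


class CoordinateAttentionFusion(nn.Module):
    # Fuse the two tracks with coordinate attention (height/width factorised gates).
    def __init__(self, channels, reduction=8):
        super().__init__()
        mid = max(channels // reduction, 8)
        self.reduce = nn.Sequential(nn.Conv2d(channels, mid, 1), nn.ReLU(inplace=True))
        self.gate_h = nn.Conv2d(mid, channels, 1)
        self.gate_w = nn.Conv2d(mid, channels, 1)

    def forward(self, local_feat, global_feat):
        # Upsample the coarser global features and merge by addition.
        global_feat = F.interpolate(global_feat, size=local_feat.shape[-2:],
                                    mode="bilinear", align_corners=False)
        x = local_feat + global_feat
        # Pool along width and height separately, then build directional gates.
        h_pool = x.mean(dim=3, keepdim=True)                  # (B, C, H, 1)
        w_pool = x.mean(dim=2, keepdim=True).transpose(2, 3)  # (B, C, W, 1)
        y = self.reduce(torch.cat([h_pool, w_pool], dim=2))
        y_h, y_w = torch.split(y, [x.shape[2], x.shape[3]], dim=2)
        a_h = torch.sigmoid(self.gate_h(y_h))
        a_w = torch.sigmoid(self.gate_w(y_w.transpose(2, 3)))
        return x * a_h * a_w


class DualTrackClassifier(nn.Module):
    # num_classes=4 assumes the four grape categories in PlantVillage
    # (black rot, esca, leaf blight, healthy).
    def __init__(self, num_classes=4, dim=64):
        super().__init__()
        self.local = LocalTrack(out_ch=dim)
        self.global_track = GlobalTrack(dim=dim)
        self.fuse = CoordinateAttentionFusion(dim)
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(dim, num_classes))

    def forward(self, x):
        return self.head(self.fuse(self.local(x), self.global_track(x)))


if __name__ == "__main__":
    logits = DualTrackClassifier()(torch.randn(2, 3, 224, 224))
    print(logits.shape)  # torch.Size([2, 4])

The point of the sketch is the wiring: a convolutional track for local detail, a token-based track for global context, and an attention-weighted fusion of the two before the classification head.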

Keywords