IEEE Access (Jan 2023)

Plant Disease Classifier: Detection of Dual-Crop Diseases Using Lightweight 2D CNN Architecture

  • Hasibul Islam Peyal,
  • Md. Nahiduzzaman,
  • Md. Abu Hanif Pramanik,
  • Md. Khalid Syfullah,
  • Saleh Mohammed Shahriar,
  • Abida Sultana,
  • Mominul Ahsan,
  • Julfikar Haider,
  • Amith Khandakar,
  • Muhammad E. H. Chowdhury

DOI
https://doi.org/10.1109/ACCESS.2023.3320686
Journal volume & issue
Vol. 11
pp. 110627 – 110643

Abstract

Tomato is one of the most widely grown crops in the world and, in one form or another, can be found in every kitchen regardless of cuisine; after potato and sweet potato, it is the most widely farmed crop on the planet. Cotton is another essential cash crop, grown by many farmers in large quantities. However, many diseases reduce the quality and quantity of tomato and cotton crops, causing significant losses in production and productivity, so it is critical to detect these disorders at an early stage. The purpose of this work is to classify 14 classes across the cotton and tomato crops, 12 diseased and two healthy, using a deep learning-based lightweight 2D CNN architecture, and to deploy the model in an Android application named “Plant Disease Classifier” for smartphone-assisted plant disease diagnosis. The experimental results reveal that the proposed model outperforms the pre-trained models VGG16, VGG19, and InceptionV3 despite having fewer parameters; with slightly more parameters than MobileNet and MobileNetV2, it also attains considerably higher accuracy than these models. The classification accuracy of these models varies between 57% and 92%, whereas the proposed model’s average accuracy is 97.36%. The precision, recall, and F1-score of the proposed model are each 97%, and its Area Under the Curve (AUC) score is 99.9%, indicating very good performance. Class activation maps were produced using the Gradient-Weighted Class Activation Mapping (Grad-CAM) technique to visually explain the disease detected by the proposed model, with a heatmap indicating the image region responsible for the classification. Owing to the lightweight nature of the model, the app performs very well and classifies the correct disease quickly, in about 4.84 ms.
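The abstract mentions Grad-CAM heatmaps for visual explanation. As a rough illustration of the computation that technique performs (a generic sketch, not the authors' implementation), the core step weights each convolutional feature map by the global-average-pooled gradient of the class score and applies a ReLU:

```python
import numpy as np

def grad_cam_heatmap(activations, gradients):
    """Generic Grad-CAM weighting step.

    activations: (H, W, K) feature maps from the last conv layer
    gradients:   (H, W, K) gradient of the class score w.r.t. those maps
    Returns a (H, W) heatmap normalized to [0, 1].
    """
    # alpha_k: global-average-pooled gradient per channel (importance weights)
    alphas = gradients.mean(axis=(0, 1))                     # shape (K,)
    # Weighted sum of feature maps over channels, then ReLU to keep
    # only features with a positive influence on the class score.
    cam = np.maximum((activations * alphas).sum(axis=-1), 0.0)
    # Normalize for display as a heatmap overlay.
    if cam.max() > 0:
        cam /= cam.max()
    return cam

# Toy example with hypothetical 4x4 feature maps over 3 channels:
rng = np.random.default_rng(0)
acts = rng.random((4, 4, 3))
grads = rng.standard_normal((4, 4, 3))
heatmap = grad_cam_heatmap(acts, grads)
```

In practice the activations and gradients would come from the trained CNN's final convolutional layer via a framework's autodiff, and the resulting heatmap would be upsampled and overlaid on the input leaf image.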

Keywords