IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2024)
Crops Leaf Disease Recognition From Digital and RS Imaging Using Fusion of Multi Self-Attention RBNet Deep Architectures and Modified Dragonfly Optimization
Abstract
Globally, pests and plant diseases severely threaten forestry and agriculture. Plant protection can be substantially enhanced by noncontact, highly effective, and affordable techniques for identifying and tracking pests and plant diseases across large geographic areas. Precision agriculture applies technologies such as hyperspectral remote sensing to improve cultivation beyond traditional agricultural methods while reducing negative environmental effects. In this article, we propose a novel deep-learning architecture and an optimization algorithm for crop leaf disease recognition. In the initial step, a multilevel contrast enhancement technique is proposed for better visualization of disease on the leaves of cotton and wheat. Next, we propose three novel architectures combining residual blocks with a self-attention mechanism, named 3-residual-block deep convolutional neural network (3-RBNet Self), 5-RBNet Self, and 9-RBNet Self. The proposed models are trained on the enhanced images, and deep features are then extracted from the self-attention layer. Because the 5-RBNet Self and 9-RBNet Self models performed best in terms of accuracy and precision, the 3-RBNet Self model was not considered for further processing. A dragonfly optimization algorithm is proposed for best-feature selection and applied to the self-attention features of the 5-RBNet Self and 9-RBNet Self models to further improve classification performance and reduce computational cost. The proposed method is evaluated on two publicly available crop disease image datasets (cotton and wheat) and on the EuroSAT dataset. For the two crops, the proposed method obtained maximum accuracies of 98.60% and 93.90%, respectively, whereas on EuroSAT it obtained an accuracy of 83.10%. Compared with recent techniques, the proposed method shows improved accuracy and precision.
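To illustrate the feature-selection step described above, the following is a minimal sketch of a binary dragonfly-style wrapper selector over a deep-feature matrix. It is not the paper's implementation: the fitness function (leave-one-out 1-NN accuracy with a small sparsity penalty), the simplified food/enemy update, the tanh transfer function, and all parameter values are illustrative assumptions.

```python
import numpy as np

def fitness(mask, X, y):
    # Illustrative fitness (not from the paper): leave-one-out 1-NN accuracy
    # on the selected feature columns, minus a small penalty on subset size.
    if mask.sum() == 0:
        return 0.0
    Xs = X[:, mask.astype(bool)]
    d = np.linalg.norm(Xs[:, None, :] - Xs[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)          # exclude each sample from its own neighbors
    pred = y[d.argmin(axis=1)]
    return (pred == y).mean() - 0.01 * mask.mean()

def binary_dragonfly_select(X, y, n_agents=10, n_iter=20, seed=0):
    # Simplified binary dragonfly dynamics: agents are 0/1 feature masks,
    # pulled toward the best agent ("food") and pushed from the worst ("enemy").
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    pop = rng.integers(0, 2, size=(n_agents, n_feat))
    step = np.zeros((n_agents, n_feat))
    scores = np.array([fitness(m, X, y) for m in pop])
    best = pop[scores.argmax()].copy()
    for _ in range(n_iter):
        food, enemy = pop[scores.argmax()], pop[scores.argmin()]
        step = (0.9 * step
                + rng.random((n_agents, n_feat)) * (food - pop)
                - rng.random((n_agents, n_feat)) * (enemy - pop))
        # Transfer function: larger step magnitude -> higher bit-flip probability.
        flip = rng.random((n_agents, n_feat)) < np.abs(np.tanh(step))
        pop = np.where(flip, 1 - pop, pop)
        scores = np.array([fitness(m, X, y) for m in pop])
        if scores.max() > fitness(best, X, y):
            best = pop[scores.argmax()].copy()
    return best.astype(bool)
```

In the paper's pipeline the rows of `X` would be the self-attention features of the 5-RBNet Self or 9-RBNet Self model; here any feature matrix with class labels can be passed in.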
Keywords