IEEE Access (Jan 2023)
Multi-Orientation Local Texture Features for Guided Attention-Based Fusion in Lung Nodule Classification
Abstract
Computed tomography (CT) scan images are widely used in automatic lung cancer detection and classification. The texture distribution of lung nodules can vary significantly throughout the CT scan volume, and accurate identification and use of the discriminative information in this volume can greatly aid the classification process. Deep stacks of recurrent and convolutional operations cannot fully represent such variations, especially with respect to the size and location of the nodules. To model this complex pattern of inter- and intra-slice dependencies in the CT slices of each nodule, a multi-orientation-based guided-attention module (MOGAM) is proposed in this paper, which offers high flexibility in concentrating on the relevant information extracted from different regions of the nodule in a non-local manner. Moreover, to provide the model with finer-grained discriminative information from the nodule volume, specifically designed local texture feature descriptors (TFDs) are extracted from the nodule slices in multiple orientations. These TFDs not only represent the distribution of textural information across multiple slices of a nodule but also encode and approximate this distribution within each slice. Extensive experimentation has demonstrated the effectiveness of the non-local combination of these local TFDs through the proposed guided-attention mechanism. According to the classification results obtained on the standard LIDC-IDRI dataset, the proposed approach outperforms its counterparts in terms of the accuracy and AUC evaluation metrics. In addition, a detailed explainability analysis of the results, as required by medical experts, is provided, demonstrating the correct functioning of the proposed attention-based fusion approach.
Keywords