Interstitial Lung Disease (ILD) refers to pulmonary disorders that affect the lung parenchyma through inflammation and fibrosis. ILD can be diagnosed visually from computed tomography (CT) scans, but this assessment is highly demanding for radiologists. Machine learning (ML) has yielded powerful models, such as convolutional neural networks (CNNs), that achieve state-of-the-art performance in image classification. However, even with advances in CNN explainability, an expert is often required to adequately justify a CNN's decisions. Radiomic features are more readable for medical analysis because they can be related to image characteristics and are used intuitively by radiologists. There is therefore potential in combining image data, via a CNN, with radiomic features to classify lung CT images. In this work, we develop two ML models: a CNN that classifies ILD from CT scans, and a Multi-Layer Perceptron (MLP) that classifies healthy versus ILD tissue from radiomic features. In the ensemble approach, the weighted outputs of the two models are combined, providing a robust method that classifies ILD using both the CT images and the radiomic features. From a high-resolution CT dataset with 32 x 32 patches of pathological and healthy lung tissue, we extract 92 radiomic features, excluding those with a Pearson correlation above 90% in the training sets of both the cross-validation and the final models. Using weights of 0.6 for the MLP and 0.4 for the CNN, our ensemble achieves an accuracy of 0.874, while the MLP alone achieved 0.870 and the CNN alone a lower accuracy.
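The correlation-based feature selection described above (dropping radiomic features whose pairwise Pearson correlation exceeds 90% on the training split) could look roughly like the following minimal sketch. The DataFrame layout, the column names, and the helper drop_correlated_features are illustrative assumptions, not the actual pipeline.

# Minimal sketch of the correlation filter: radiomic features whose pairwise
# Pearson correlation exceeds 0.9 on the training split are removed.
# The 0.9 threshold and the count of 92 features follow the text; the data
# layout is assumed.
import numpy as np
import pandas as pd


def drop_correlated_features(train_df: pd.DataFrame, threshold: float = 0.9) -> list[str]:
    """Return the feature names kept after removing highly correlated ones."""
    corr = train_df.corr(method="pearson").abs()
    # Inspect only the upper triangle so each feature pair is checked once.
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [col for col in upper.columns if (upper[col] > threshold).any()]
    return [col for col in train_df.columns if col not in to_drop]


# Example usage with random stand-in data for the 92 radiomic features.
rng = np.random.default_rng(0)
fake_features = pd.DataFrame(rng.normal(size=(100, 92)),
                             columns=[f"radiomic_{i}" for i in range(92)])
kept = drop_correlated_features(fake_features, threshold=0.9)
print(f"kept {len(kept)} of {fake_features.shape[1]} features")

In the actual workflow, the filter would be fitted on each training set (both in cross-validation and for the final model) and the resulting feature list applied unchanged to the corresponding test data.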
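The weighted ensemble can be read as a soft-voting scheme in which the per-class probabilities of the two models are averaged with fixed weights (0.6 for the MLP, 0.4 for the CNN). The sketch below assumes both models output probabilities over the same classes; the probability arrays and the ensemble_predict helper are placeholders rather than the paper's implementation.

# Minimal sketch of the weighted ensemble, assuming both models produce
# per-class probabilities for the same set of classes.
import numpy as np


def ensemble_predict(mlp_probs: np.ndarray,
                     cnn_probs: np.ndarray,
                     w_mlp: float = 0.6,
                     w_cnn: float = 0.4) -> np.ndarray:
    """Weighted average of class probabilities, then argmax per sample."""
    combined = w_mlp * mlp_probs + w_cnn * cnn_probs
    return combined.argmax(axis=1)


# Toy example with two samples and two classes (healthy vs. ILD).
mlp_probs = np.array([[0.80, 0.20], [0.30, 0.70]])
cnn_probs = np.array([[0.55, 0.45], [0.60, 0.40]])
print(ensemble_predict(mlp_probs, cnn_probs))  # -> [0 1]

With these weights, the MLP's radiomics-based prediction dominates unless the CNN disagrees strongly, which matches the heavier weight given to the MLP in the reported configuration.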