Breast cancer presents a significant health challenge worldwide, necessitating innovative techniques for early detection and prognosis. Although mammography is the established screening method, it has drawbacks, including radiation exposure and high costs. Recent studies have explored the application of machine learning to frontal infrared images for breast cancer detection; however, the potential of infrared imaging from angular views has not been thoroughly explored. In this paper, we investigate, develop, and evaluate classification models for breast cancer diagnosis using lateral and oblique infrared images. Our approach incorporates radiomic features and convolutional neural network features, along with various feature fusion techniques, to train deep neural networks. The primary objective is to determine the suitability of angular views for breast cancer detection, identify the most effective view, and assess its impact on classification accuracy. Using the publicly available Database for Mastology Research with Infrared Images (DMR-IR), we apply an image processing pipeline for image enhancement and segmentation. We then extract features using two strategies: radiomic features and convolutional neural network features. Subsequently, we conduct a series of k-fold cross-validation experiments to assess the effectiveness of each feature set and feature fusion technique. Our findings indicate that oblique images, particularly when combined with DenseNet features, demonstrate superior performance, achieving an average accuracy of 97.74%, specificity of 95.25%, and an F1 score of 98.24%. This study contributes to the advancement of machine learning in early breast cancer detection and underscores the significant potential of angular views in thermal infrared imaging, leading to improved diagnostic outcomes for patients worldwide.
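The evaluation protocol described above (fusing two feature sets, then scoring a classifier with k-fold cross-validation) can be sketched minimally as follows. This is an illustrative sketch only: the concatenation-based fusion and the nearest-centroid classifier stand in for the paper's actual fusion strategies and DenseNet-based models, and all function names are hypothetical.

```python
import random
from statistics import mean

def fuse(radiomic, cnn):
    """Early fusion by concatenating per-sample feature vectors
    (assumption: one of several possible fusion strategies)."""
    return [r + c for r, c in zip(radiomic, cnn)]

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def sqdist(a, b):
    """Squared Euclidean distance between two vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest_centroid_cv(X, y, k=5, seed=0):
    """Average accuracy of a nearest-centroid classifier over
    k cross-validation folds (stand-in for the trained networks)."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    accuracies = []
    for f in range(k):
        test = set(folds[f])
        train = [i for i in idx if i not in test]
        # Fit: one centroid per class from the training split.
        cents = {label: centroid([X[i] for i in train if y[i] == label])
                 for label in set(y)}
        # Predict: assign each test sample to its nearest class centroid.
        correct = sum(
            1 for i in folds[f]
            if min(cents, key=lambda lb: sqdist(X[i], cents[lb])) == y[i]
        )
        accuracies.append(correct / len(folds[f]))
    return mean(accuracies)
```

On well-separated synthetic clusters, `nearest_centroid_cv(fuse(radiomic, cnn), labels)` returns the mean per-fold accuracy; in practice each feature set and fusion variant would be scored this way and compared.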