Densely Convolutional Networks for Breast Cancer Classification with Multi-Modal Image Fusion
Eman Hamdy, Arab Academy for Science, Technology and Maritime Transport, Alexandria, Egypt
Osama Badawy, Arab Academy for Science, Technology and Maritime Transport, Alexandria, Egypt
Mohamed Zaghloul, Egypt
Abstract: Breast cancer is a major health burden worldwide. It arises in the breast when cells grow out of control, beginning as in-situ carcinoma and, once it spreads to other tissues, becoming invasive carcinoma. A breast mass detected early through imaging can be diagnosed and treated far more easily. Several imaging modalities are used for breast cancer classification, such as mammography, ultrasound, and magnetic resonance imaging (MRI). Two fusion schemes are commonly used: early fusion and late fusion. Early fusion captures only a simple relationship between modalities, whereas late fusion devotes more of the model to the fusion strategy and can learn the complex relationships among the modalities; as a result, late fusion generally yields higher accuracy. In this work, two image modalities (mammography and ultrasound) are combined with a spreadsheet of attributes (age, view, side, and status) associated with each mammographic image, using DenseNet-201 with a layer-level (late) fusion strategy that connects the different paths, and layers within the same path, through concatenation layers. Fusing at the feature level achieves the best performance across several evaluation metrics (accuracy, recall, precision, area under the curve, and F1 score).
Keywords: Breast cancer classification, deep learning, DenseNet, diagnostic imaging, multimodal imaging.
Received April 3, 2022; accepted April 28, 2022