Effective Brain Tumor Classification Using Deep Residual Network-Based Transfer Learning

Authors

  • D. Saida Department of Computer Science & Engineering, University College of Engineering, Osmania University, Hyderabad, Telangana 500007, India https://orcid.org/0009-0007-8481-1036
  • KLSDT Keerthi Vardhan Department of Computer Science & Engineering, Siddhartha Institute of Engineering & Technology, Ranga Reddy-501510, Telangana, India
  • P. Premchand Department of Computer Science & Engineering, University College of Engineering, Osmania University, Hyderabad, Telangana 500007, India

DOI:

https://doi.org/10.32985/ijeces.14.6.2

Keywords:

Brain Tumor Segmentation, Convolutional Neural Network, Deep Residual Network, Magnetic Resonance Images, U-Net Architecture

Abstract

Brain tumor classification is an essential task in medical image processing that assists doctors in reaching accurate diagnoses and treatment plans. A Deep Residual Network (ResNet-50) based transfer learning approach applied to a fully convolutional Convolutional Neural Network (CNN) is proposed to classify brain tumors in Magnetic Resonance Images (MRI) from the BRATS 2020 dataset. The dataset consists of a variety of pre-operative MRI scans for segmenting brain tumors, namely gliomas, that vary widely in appearance, shape, and histology. The 50-layer residual network deeply convolves the multiple tumor categories in the classification task using convolution blocks and identity blocks. Limitations of existing approaches, such as the limited accuracy and algorithmic complexity of the CNN-based ME-Net and the classification issues of YOLOv2-inception models, are resolved by the proposed model. The trained CNN learns boundary and region tasks and extracts useful contextual information from MRI scans at minimal computational cost. Tumor segmentation and classification are performed in a single step using a U-Net architecture, which helps retain the spatial features of the image. Multimodality fusion is implemented to perform the classification and regression tasks by integrating information across the dataset. The Dice scores of the proposed model for Enhanced Tumor (ET), Whole Tumor (WT), and Tumor Core (TC) are 0.88, 0.97, and 0.90 on the BRATS 2020 dataset, and the model also achieves 99.94% accuracy, 98.92% sensitivity, 98.63% specificity, and 99.94% precision.
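
To make the transfer-learning setup described in the abstract concrete, the sketch below (not the authors' published code) shows a minimal ResNet-50 classifier with an ImageNet-pretrained, frozen backbone and a replaced fully connected head, plus a simple Dice-score helper of the kind used to evaluate the ET/WT/TC regions. The class count, input size, and layer-freezing choice are illustrative assumptions only.

```python
# Minimal sketch, assuming PyTorch/torchvision; not the authors' implementation.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4  # assumption: number of tumor categories used for illustration


def build_resnet50_classifier(num_classes: int = NUM_CLASSES) -> nn.Module:
    """Load an ImageNet-pretrained ResNet-50 and swap its final FC layer."""
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    for param in model.parameters():  # freeze the convolutional backbone
        param.requires_grad = False
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # trainable head
    return model


def dice_score(pred_mask: torch.Tensor, true_mask: torch.Tensor,
               eps: float = 1e-6) -> float:
    """Dice coefficient between two binary masks (e.g. ET, WT, or TC regions)."""
    pred = pred_mask.float().flatten()
    true = true_mask.float().flatten()
    intersection = (pred * true).sum()
    return ((2.0 * intersection + eps) / (pred.sum() + true.sum() + eps)).item()


if __name__ == "__main__":
    model = build_resnet50_classifier()
    dummy_slice = torch.randn(1, 3, 224, 224)  # one RGB-converted MRI slice (assumed size)
    logits = model(dummy_slice)
    print(logits.shape)  # torch.Size([1, 4])
```

In a transfer-learning regime like the one the paper describes, only the replaced head (and optionally the last residual stage) is fine-tuned on the MRI data, which keeps the computational cost low while reusing the pretrained convolutional features.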

Published

2023-07-10

How to Cite

D. Saida, K. Keerthi Vardhan, and P. Premchand, “Effective Brain Tumor Classification Using Deep Residual Network-Based Transfer Learning”, IJECES, vol. 14, no. 6, pp. 625-634, Jul. 2023.

Issue

Vol. 14 No. 6 (2023)

Section

Original Scientific Papers