Transfer Learning Based Deep Neural Network for Detecting Artefacts in Endoscopic Images

Authors

  • Kirthika Natarajan, School of Engineering, Avinashilingam Institute for Home Science and Higher Education for Women, Varapalayam, Coimbatore, Tamil Nadu 641 108, India.
  • Dr. B. Sargunam, School of Engineering, Avinashilingam Institute for Home Science and Higher Education for Women, Varapalayam, Coimbatore, Tamil Nadu 641 108, India.

DOI:

https://doi.org/10.32985/ijeces.13.8.3

Keywords:

Deep Learning, Artefacts, Endoscopy, Transfer Learning

Abstract

Endoscopy is typically used to visualize various parts of the digestive tract. The technique is well suited to detecting abnormalities such as cancer or polyps, taking a sample of tissue (a biopsy), or cauterizing a bleeding vessel. The video/images generated during the procedure are affected by eight artefact classes: saturation, specularity, blood, blur, bubbles, contrast, instrument, and miscellaneous artefacts such as floating debris and chromatic aberration. Frames affected by artefacts are mostly discarded because the clinician can extract no valuable information from them, which hampers post-processing steps. Based on the transfer learning approach, three state-of-the-art deep learning models, namely YOLOv3, YOLOv4 and Faster R-CNN, were trained with images from the EAD public datasets and a custom dataset of endoscopic images of Indian patients annotated for the artefacts mentioned above. The training images were data-augmented and used to train all three artefact detectors. The predictions of the artefact detectors are combined to form an ensemble model whose results outperform existing works in the literature, achieving a mAP score of 0.561 and an IoU score of 0.682. An inference time of 80.4 ms was recorded, the best reported in the literature.
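The abstract does not specify how the three detectors' predictions are combined. One common fusion strategy, shown here only as a hedged illustration (the function names, detection tuple layout, and the 0.5 IoU threshold are assumptions, not the paper's method), is to pool all detections and apply class-wise non-maximum suppression using the same IoU measure the paper reports:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def ensemble_nms(model_outputs, iou_thresh=0.5):
    """Pool detections from several detectors and suppress duplicates.

    model_outputs: one list per detector of (box, score, label) tuples.
    A box overlapping a higher-scoring detection of the same class
    beyond iou_thresh is discarded (greedy class-wise NMS).
    """
    pooled = sorted((d for out in model_outputs for d in out),
                    key=lambda d: d[1], reverse=True)
    kept = []
    for box, score, label in pooled:
        if all(k[2] != label or iou(box, k[0]) < iou_thresh for k in kept):
            kept.append((box, score, label))
    return kept
```

For example, if two detectors both flag a "blur" region with heavily overlapping boxes, only the higher-scoring box survives, while a "bubbles" detection elsewhere in the frame is kept unchanged.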

Published

2022-10-25

How to Cite

[1]
K. Natarajan and S. Balusamy, “Transfer Learning Based Deep Neural Network for Detecting Artefacts in Endoscopic Images”, IJECES, vol. 13, no. 8, pp. 633-641, Oct. 2022.

Section

Original Scientific Papers