Show simple item record

dc.contributor.author: Nogueira Rodríguez, Alba
dc.contributor.author: Dominguez Carbajales, Ruben
dc.contributor.author: Campos-Tato, F.
dc.contributor.author: Herrero Rivas, Jesús Miguel
dc.contributor.author: Puga Gimenez de Azcárate, Manuel
dc.contributor.author: Remedios Espino, David Rafael
dc.contributor.author: Rivas Moral, Laura
dc.contributor.author: Sánchez Hernández, Eloy
dc.contributor.author: Iglesias Gómez, Agueda
dc.contributor.author: Cubiella Fernández, Joaquín
dc.contributor.author: Fernández Riverola, Florentino
dc.contributor.author: Lopez-Fernandez, H.
dc.contributor.author: Reboiro Jato, Miguel
dc.contributor.author: González Pena, Daniel
dc.date.accessioned: 2024-01-02T10:05:07Z
dc.date.available: 2024-01-02T10:05:07Z
dc.date.issued: 2021
dc.identifier.issn: 0941-0643
dc.identifier.uri: http://hdl.handle.net/20.500.11940/18531
dc.description.abstract: Colorectal cancer is a major health problem, where advances towards computer-aided diagnosis (CAD) systems to assist the endoscopist can be a promising path to improvement. Here, a deep learning model for real-time polyp detection based on a pre-trained YOLOv3 (You Only Look Once) architecture and complemented with a post-processing step based on an object-tracking algorithm to reduce false positives is reported. The base YOLOv3 network was fine-tuned using a dataset composed of 28,576 images labelled with the locations of 941 polyps, which will be made public soon. In a frame-based evaluation using isolated images containing polyps, a general F1 score of 0.88 was achieved (recall = 0.87, precision = 0.89), with lower predictive performance for flat polyps but higher for sessile and pedunculated morphologies, as well as with the use of narrow-band imaging, whereas polyp size < 5 mm does not seem to have a significant impact. In a polyp-based evaluation using polyp and normal-mucosa videos, with a positive criterion defined as the presence of at least one 50-frame (window size) segment in which at least 75% of frames contain predicted bounding boxes (frames positivity), a sensitivity of 72.61% (95% CI 68.99-75.95) and a specificity of 83.04% (95% CI 76.70-87.92) were achieved (Youden = 0.55, diagnostic odds ratio (DOR) = 12.98). When the positive criterion is less stringent (window size = 25, frames positivity = 50%), sensitivity reaches around 90% (sensitivity = 89.91%, 95% CI 87.20-91.94; specificity = 54.97%, 95% CI 47.49-62.24; Youden = 0.45; DOR = 10.76). The object-tracking algorithm demonstrated a significant improvement in specificity while maintaining sensitivity, with only a marginal impact on computational performance. These results suggest that the model could be effectively integrated into a CAD system.
dc.language.iso: en
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.title: Real-time polyp detection model using convolutional neural networks
dc.type: Journal Article (es)
dc.authorsophos: Nogueira-Rodriguez, A.;Dominguez-Carbajales, R.;Campos-Tato, F.;Herrero, J.;Puga, M.;Remedios, D.;Rivas, L.;Sanchez, E.;Iglesias, A.;Cubiella, J.;Fdez-Riverola, F.;Lopez-Fernandez, H.;Reboiro-Jato, M.;Glez-Pena, D.
dc.identifier.doi: 10.1007/s00521-021-06496-4
dc.identifier.sophos: 46937
dc.issue.number: 0
dc.journal.title: NEURAL COMPUTING & APPLICATIONS
dc.organization: Servizo Galego de Saúde::Áreas Sanitarias (A.S.)::Área Sanitaria de Ourense, Verín e O Barco de Valdeorras - Complexo Hospitalario Universitario de Ourense::Dixestivo
dc.organization: Servizo Galego de Saúde::Áreas Sanitarias (A.S.)::Área Sanitaria de Ourense, Verín e O Barco de Valdeorras - Complexo Hospitalario Universitario de Ourense::Informática
dc.relation.publisherversion: https://link.springer.com/content/pdf/10.1007/s00521-021-06496-4.pdf (es)
dc.rights.accessRights: openAccess
dc.subject.keyword: CHUO (es)
dc.typefides: Scientific Article (includes Original, Brief Original, Systematic Review and Meta-analysis) (es)
dc.typesophos: Original Article (es)
dc.volume.number: 0
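
The polyp-based evaluation criterion described in the abstract (a video counts as positive if it contains at least one window of consecutive frames in which a given fraction of frames has a predicted bounding box) can be expressed as a short sliding-window check. The Python sketch below is illustrative only and assumes the per-frame detector output is already available as a list of booleans; the function name video_is_positive and its parameters are hypothetical and not taken from the paper's code.

# Minimal sketch (not from the paper's code): decide whether a colonoscopy
# video is "positive" under the window-based criterion from the abstract,
# e.g. window_size=50 with frames_positivity=0.75, or the less stringent
# window_size=25 with frames_positivity=0.50.
def video_is_positive(frame_has_box, window_size=50, frames_positivity=0.75):
    """frame_has_box: list of booleans, one per frame, True when the
    detector predicted at least one bounding box in that frame."""
    if len(frame_has_box) < window_size:
        return False
    needed = window_size * frames_positivity   # minimum positive frames per window
    count = sum(frame_has_box[:window_size])   # positives in the first window
    if count >= needed:
        return True
    for i in range(window_size, len(frame_has_box)):
        # Slide the window one frame forward: add the new frame, drop the oldest.
        count += int(frame_has_box[i]) - int(frame_has_box[i - window_size])
        if count >= needed:
            return True
    return False

# Example: the stricter criterion vs. the relaxed one on the same video.
# detections = [True, False, True, ...]  # one flag per frame
# strict = video_is_positive(detections, window_size=50, frames_positivity=0.75)
# relaxed = video_is_positive(detections, window_size=25, frames_positivity=0.50)

The sensitivity and specificity figures quoted in the abstract then follow from applying this per-video decision to the polyp videos and the normal-mucosa videos, respectively.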


Files in this item

This item appears in the following collection(s)


Attribution 4.0 International
Except where otherwise noted, this item's license is described as Attribution 4.0 International