Breast Cancer Diagnosis in Two-View Mammography Using End-to-End Trained EfficientNet-Based Convolutional Network

dc.contributor: Sistema FMUSP-HC: Faculdade de Medicina da Universidade de São Paulo (FMUSP) e Hospital das Clínicas da FMUSP
dc.contributor.author: PETRINI, Daniel G. P.
dc.contributor.author: SHIMIZU, Carlos
dc.contributor.author: ROELA, Rosimeire A.
dc.contributor.author: VALENTE, Gabriel Vansuita
dc.contributor.author: FOLGUEIRA, Maria Aparecida Azevedo Koike
dc.contributor.author: KIM, Hae Yong
dc.date.accessioned: 2022-08-12T17:06:14Z
dc.date.available: 2022-08-12T17:06:14Z
dc.date.issued: 2022
dc.description.abstract: Some recent studies have described deep convolutional neural networks that diagnose breast cancer in mammograms with performance similar or even superior to that of human experts. One of the best techniques performs two transfer learnings: the first uses a model trained on natural images to create a "patch classifier" that categorizes small subimages; the second uses the patch classifier to scan the whole mammogram and create a "single-view whole-image classifier". We propose a third transfer learning to obtain a "two-view classifier" that uses the two mammographic views: bilateral craniocaudal and mediolateral oblique. We use EfficientNet as the basis of our model and train the entire system end-to-end on the CBIS-DDSM dataset. To ensure statistical robustness, we test our system twice, using: (a) 5-fold cross-validation; and (b) the original training/test split of the dataset. Our technique reached an AUC of 0.9344 under 5-fold cross-validation (accuracy, sensitivity, and specificity are all 85.13% at the equal-error-rate point of the ROC curve). Using the original dataset split, our technique achieved an AUC of 0.8483, to our knowledge the highest AUC reported for this problem, although subtle differences in the testing conditions of each work prevent an exact comparison. The inference code and model are available at https://github.com/dpetrini/two-views-classifier
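For illustration, the two-view architecture the abstract describes can be pictured as two single-view backbones whose features are fused before a final classification head. The following is a minimal PyTorch sketch under stated assumptions, not the authors' released model: it assumes the EfficientNet-B0 variant from torchvision, separately weighted backbones per view, and simple feature concatenation as the fusion step (the actual model is at https://github.com/dpetrini/two-views-classifier).

    # Minimal sketch of a two-view mammography classifier: one backbone per
    # view (CC and MLO), pooled features concatenated, one malignancy logit.
    # The B0 variant, per-view weights, and concatenation fusion are
    # assumptions of this sketch, not details confirmed by the record.
    import torch
    import torch.nn as nn
    from torchvision.models import efficientnet_b0

    class TwoViewClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            # One EfficientNet-B0 feature extractor per view (1280 channels out).
            self.cc_backbone = efficientnet_b0(weights=None).features
            self.mlo_backbone = efficientnet_b0(weights=None).features
            self.pool = nn.AdaptiveAvgPool2d(1)
            self.head = nn.Linear(2 * 1280, 1)  # fused features -> malignancy logit

        def forward(self, cc, mlo):
            f_cc = self.pool(self.cc_backbone(cc)).flatten(1)
            f_mlo = self.pool(self.mlo_backbone(mlo)).flatten(1)
            return self.head(torch.cat([f_cc, f_mlo], dim=1))

    # Toy usage: grayscale mammograms replicated to 3 channels beforehand.
    cc, mlo = torch.randn(2, 3, 224, 224), torch.randn(2, 3, 224, 224)
    print(TwoViewClassifier()(cc, mlo).shape)  # torch.Size([2, 1])

The abstract reports accuracy, sensitivity, and specificity as a single number (85.13%) because at the equal-error-rate point of the ROC curve sensitivity equals specificity, and accuracy then coincides with both regardless of class prevalence. A short scikit-learn sketch of locating that point, using entirely synthetic scores:

    # Locate the ROC equal-error-rate point, where TPR == 1 - FPR, so
    # sensitivity, specificity, and accuracy all coincide. Data is synthetic.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    y_true = rng.integers(0, 2, 1000)
    y_score = y_true + rng.normal(0, 0.7, 1000)  # toy classifier scores

    fpr, tpr, _ = roc_curve(y_true, y_score)
    i = int(np.argmin(np.abs(tpr - (1 - fpr))))  # closest point to TPR == 1 - FPR
    print("AUC:", roc_auc_score(y_true, y_score))
    print("sensitivity:", tpr[i], "specificity:", 1 - fpr[i])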
dc.description.index: PubMed
dc.description.sponsorship: National Council for Scientific and Technological Development (CNPq) [305377/2018-3]
dc.identifier.citation: IEEE ACCESS, v.10, p.77723-77731, 2022
dc.identifier.doi: 10.1109/ACCESS.2022.3193250
dc.identifier.issn: 2169-3536
dc.identifier.uri: https://observatorio.fm.usp.br/handle/OPI/48367
dc.language.iso: eng
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.relation.ispartof: IEEE Access
dc.rights: openAccess
dc.rights.holder: Copyright IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.subject: Mammography
dc.subject: Convolutional neural networks
dc.subject: Training
dc.subject: Transfer learning
dc.subject: Breast cancer
dc.subject: Artificial intelligence
dc.subject: Lesions
dc.subject: Breast cancer diagnosis
dc.subject: deep learning
dc.subject: convolutional neural network
dc.subject: mammogram
dc.subject: transfer learning
dc.subject.wos: Computer Science, Information Systems
dc.subject.wos: Engineering, Electrical & Electronic
dc.subject.wos: Telecommunications
dc.title: Breast Cancer Diagnosis in Two-View Mammography Using End-to-End Trained EfficientNet-Based Convolutional Network
dc.type: article
dc.type.category: original article
dc.type.version: publishedVersion
dspace.entity.type: Publication
hcfmusp.author.external: PETRINI, Daniel G. P.: Univ Sao Paulo, Escola Politecn, BR-05508010 Sao Paulo, Brazil
hcfmusp.author.external: KIM, Hae Yong: Univ Sao Paulo, Escola Politecn, BR-05508010 Sao Paulo, Brazil
hcfmusp.citation.scopus: 18
hcfmusp.contributor.author-fmusphc: CARLOS SHIMIZU
hcfmusp.contributor.author-fmusphc: ROSIMEIRE APARECIDA ROELA
hcfmusp.contributor.author-fmusphc: GABRIEL VANSUITA VALENTE
hcfmusp.contributor.author-fmusphc: MARIA APARECIDA AZEVEDO KOIKE FOLGUEIRA
hcfmusp.description.beginpage: 77723
hcfmusp.description.endpage: 77731
hcfmusp.description.volume: 10
hcfmusp.origem: WOS
hcfmusp.origem.scopus: 2-s2.0-85135239069
hcfmusp.origem.wos: WOS:000832955000001
hcfmusp.publisher.city: PISCATAWAY
hcfmusp.publisher.country: USA
hcfmusp.relation.reference: Alsolami A. S., 2021, DATA, V6, P111
hcfmusp.relation.reference: Bowyer K, 1996, INT CONGR SER, V1119, P431
hcfmusp.relation.reference: Almeida RMD, 2021, PROCEEDINGS OF THE 23RD INTERNATIONAL CONFERENCE ON ENTERPRISE INFORMATION SYSTEMS (ICEIS 2021), VOL 1, P660, DOI 10.5220/0010440906600667
hcfmusp.relation.reference: Gotmare A., 2018, ARXIV
hcfmusp.relation.reference: Hanley JA, 1982, RADIOLOGY, V143, P29, DOI 10.1148/radiology.143.1.7063747
hcfmusp.relation.reference: He KM, 2016, PROC CVPR IEEE, P770, DOI 10.1109/CVPR.2016.90
hcfmusp.relation.reference: Kooi T, 2017, MED IMAGE ANAL, V35, P303, DOI 10.1016/j.media.2016.07.007
hcfmusp.relation.reference: Krizhevsky A, 2017, COMMUN ACM, V60, P84, DOI 10.1145/3065386
hcfmusp.relation.reference: LeCun Y, 1989, NEURAL COMPUT, V1, P541, DOI 10.1162/neco.1989.1.4.541
hcfmusp.relation.reference: LeCun Y, 2015, NATURE, V521, P436, DOI 10.1038/nature14539
hcfmusp.relation.reference: Lee RS, 2017, SCI DATA, V4, DOI 10.1038/sdata.2017.177
hcfmusp.relation.reference: McKinney SM, 2020, NATURE, V577, P89, DOI 10.1038/s41586-019-1799-6
hcfmusp.relation.reference: Moreira IC, 2012, ACAD RADIOL, V19, P236, DOI 10.1016/j.acra.2011.09.014
hcfmusp.relation.reference: Panceri S. S., 2021, PROC INT JOINT C NEU, P1
hcfmusp.relation.reference: Petrini DG, 2021, CANCER RES, V81
hcfmusp.relation.reference: Pham H. H., 2022, PHYSIONET, P1
hcfmusp.relation.reference: Rodriguez-Ruiz A, 2019, JNCI-J NATL CANCER I, V111, P916, DOI 10.1093/jnci/djy222
hcfmusp.relation.reference: Russakovsky O, 2015, INT J COMPUT VISION, V115, P211, DOI 10.1007/s11263-015-0816-y
hcfmusp.relation.reference: Shen L., INCONSISTENT RESULTS
hcfmusp.relation.reference: Shen L, 2019, SCI REP-UK, V9, DOI 10.1038/s41598-019-48995-4
hcfmusp.relation.reference: Shu X, 2020, IEEE T MED IMAGING, V39, P2246, DOI 10.1109/TMI.2020.2968397
hcfmusp.relation.reference: Sorkhei M., 2021, PROC NEURAL INF PROC, V1, P1
hcfmusp.relation.reference: Tan MX, 2019, PROC CVPR IEEE, P2815, DOI 10.1109/CVPR.2019.00293
hcfmusp.relation.reference: Tan MX, 2019, PR MACH LEARN RES, V97
hcfmusp.relation.reference: Wei T., 2021, ARXIV
hcfmusp.relation.reference: Wu N, 2020, IEEE T MED IMAGING, V39, P1184, DOI 10.1109/TMI.2019.2945514
hcfmusp.relation.reference: Zhang XY, 2016, IEEE T PATTERN ANAL, V38, P1943, DOI 10.1109/TPAMI.2015.2502579
hcfmusp.scopus.lastupdate: 2024-05-10
relation.isAuthorOfPublication: 97bc2930-e615-479c-870e-9ce4eeda6230
relation.isAuthorOfPublication: dae4f6fe-d43e-4817-abae-9693cd1c9252
relation.isAuthorOfPublication: db4d14fa-6448-42ab-b1e9-4af665935acd
relation.isAuthorOfPublication: 6d5113ff-467c-433f-9835-3a3f82a49cdc
relation.isAuthorOfPublication.latestForDiscovery: 97bc2930-e615-479c-870e-9ce4eeda6230
Files
Original Bundle
Name: art_PETRINI_Breast_Cancer_Diagnosis_in_TwoView_Mammography_Using_EndtoEnd_2022.PDF
Size: 1.06 MB
Format: Adobe Portable Document Format
Description: publishedVersion (English)