Breast Cancer Diagnosis in Two-View Mammography Using End-to-End Trained EfficientNet-Based Convolutional Network
dc.contributor | Sistema FMUSP-HC: Faculdade de Medicina da Universidade de São Paulo (FMUSP) e Hospital das Clínicas da FMUSP | |
dc.contributor.author | PETRINI, Daniel G. P. | |
dc.contributor.author | SHIMIZU, Carlos | |
dc.contributor.author | ROELA, Rosimeire A. | |
dc.contributor.author | VALENTE, Gabriel Vansuita | |
dc.contributor.author | FOLGUEIRA, Maria Aparecida Azevedo Koike | |
dc.contributor.author | KIM, Hae Yong | |
dc.date.accessioned | 2022-08-12T17:06:14Z | |
dc.date.available | 2022-08-12T17:06:14Z | |
dc.date.issued | 2022 | |
dc.description.abstract | Some recent studies have described deep convolutional neural networks that diagnose breast cancer in mammograms with performance similar or even superior to that of human experts. One of the best techniques performs two transfer-learning steps: the first uses a model trained on natural images to create a "patch classifier" that categorizes small subimages; the second uses the patch classifier to scan the whole mammogram and create the "single-view whole-image classifier". We propose a third transfer learning to obtain a "two-view classifier" that uses the two mammographic views: bilateral craniocaudal and mediolateral oblique. We use EfficientNet as the basis of our model and train the entire system end-to-end on the CBIS-DDSM dataset. To ensure statistical robustness, we test our system twice, using: (a) 5-fold cross-validation; and (b) the original training/test division of the dataset. Our technique reached an AUC of 0.9344 using 5-fold cross-validation (accuracy, sensitivity and specificity are 85.13% at the equal-error-rate point of the ROC curve). Using the original dataset division, our technique achieved an AUC of 0.8483, as far as we know the highest reported AUC for this problem, although subtle differences in the testing conditions of each work do not allow for an accurate comparison. The inference code and model are available at https://github.com/dpetrini/two-views-classifier | eng |
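The abstract reports AUC together with accuracy, sensitivity and specificity at the equal-error-rate (EER) point of the ROC curve, i.e. the operating threshold where sensitivity equals specificity. A minimal sketch of how such metrics can be computed from binary labels and classifier scores is shown below; this is an illustration with NumPy, not code from the paper's repository, and the function name is ours.

```python
import numpy as np

def roc_auc_and_eer(labels, scores):
    """Compute ROC AUC and the equal-error-rate operating point
    (where sensitivity == specificity) from binary labels and scores."""
    labels = np.asarray(labels, dtype=float)
    scores = np.asarray(scores, dtype=float)
    order = np.argsort(-scores)            # sort by descending score
    labels = labels[order]
    P = labels.sum()                       # number of positives
    N = len(labels) - P                    # number of negatives
    tpr = np.cumsum(labels) / P            # sensitivity at each threshold
    fpr = np.cumsum(1 - labels) / N        # 1 - specificity
    # Trapezoidal area under the ROC curve
    auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)
    # EER point: threshold where sensitivity is closest to specificity
    idx = np.argmin(np.abs(tpr - (1 - fpr)))
    return auc, tpr[idx], 1 - fpr[idx]     # (AUC, sensitivity, specificity)

# Example with perfectly separated scores: AUC = 1.0, EER point at 100%/100%
auc, sens, spec = roc_auc_and_eer([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9])
```

At the EER point, sensitivity and specificity coincide, which is why the abstract can report a single 85.13% figure for accuracy, sensitivity and specificity.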
dc.description.index | PubMed | eng |
dc.description.sponsorship | National Council for Scientific and Technological Development (CNPq) [305377/2018-3] | |
dc.identifier.citation | IEEE ACCESS, v.10, p.77723-77731, 2022 | |
dc.identifier.doi | 10.1109/ACCESS.2022.3193250 | |
dc.identifier.issn | 2169-3536 | |
dc.identifier.uri | https://observatorio.fm.usp.br/handle/OPI/48367 | |
dc.language.iso | eng | |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | eng |
dc.relation.ispartof | IEEE Access | |
dc.rights | openAccess | eng |
dc.rights.holder | Copyright IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | eng |
dc.subject | Mammography | eng |
dc.subject | Convolutional neural networks | eng |
dc.subject | Training | eng |
dc.subject | Transfer learning | eng |
dc.subject | Breast cancer | eng |
dc.subject | Artificial intelligence | eng |
dc.subject | Lesions | eng |
dc.subject | Breast cancer diagnosis | eng |
dc.subject | deep learning | eng |
dc.subject | convolutional neural network | eng |
dc.subject | mammogram | eng |
dc.subject | transfer learning | eng |
dc.subject.wos | Computer Science, Information Systems | eng |
dc.subject.wos | Engineering, Electrical & Electronic | eng |
dc.subject.wos | Telecommunications | eng |
dc.title | Breast Cancer Diagnosis in Two-View Mammography Using End-to-End Trained EfficientNet-Based Convolutional Network | eng |
dc.type | article | eng |
dc.type.category | original article | eng |
dc.type.version | publishedVersion | eng |
dspace.entity.type | Publication | |
hcfmusp.author.external | PETRINI, Daniel G. P.:Univ Sao Paulo, Escola Politecn, BR-05508010 Sao Paulo, Brazil | |
hcfmusp.author.external | KIM, Hae Yong:Univ Sao Paulo, Escola Politecn, BR-05508010 Sao Paulo, Brazil | |
hcfmusp.citation.scopus | 18 | |
hcfmusp.contributor.author-fmusphc | CARLOS SHIMIZU | |
hcfmusp.contributor.author-fmusphc | ROSIMEIRE APARECIDA ROELA | |
hcfmusp.contributor.author-fmusphc | GABRIEL VANSUITA VALENTE | |
hcfmusp.contributor.author-fmusphc | MARIA APARECIDA AZEVEDO KOIKE FOLGUEIRA | |
hcfmusp.description.beginpage | 77723 | |
hcfmusp.description.endpage | 77731 | |
hcfmusp.description.volume | 10 | |
hcfmusp.origem | WOS | |
hcfmusp.origem.scopus | 2-s2.0-85135239069 | |
hcfmusp.origem.wos | WOS:000832955000001 | |
hcfmusp.publisher.city | PISCATAWAY | eng |
hcfmusp.publisher.country | USA | eng |
hcfmusp.relation.reference | Alsolami A. S., 2021, DATA, V6, P111 | eng |
hcfmusp.relation.reference | Bowyer K, 1996, INT CONGR SER, V1119, P431 | eng |
hcfmusp.relation.reference | Almeida RMD, 2021, PROCEEDINGS OF THE 23RD INTERNATIONAL CONFERENCE ON ENTERPRISE INFORMATION SYSTEMS (ICEIS 2021), VOL 1, P660, DOI 10.5220/0010440906600667 | eng |
hcfmusp.relation.reference | Gotmare A., 2018, ARXIV | eng |
hcfmusp.relation.reference | HANLEY JA, 1982, RADIOLOGY, V143, P29, DOI 10.1148/radiology.143.1.7063747 | eng |
hcfmusp.relation.reference | He KM, 2016, PROC CVPR IEEE, P770, DOI 10.1109/CVPR.2016.90 | eng |
hcfmusp.relation.reference | Kooi T, 2017, MED IMAGE ANAL, V35, P303, DOI 10.1016/j.media.2016.07.007 | eng |
hcfmusp.relation.reference | Krizhevsky Alex, 2017, Communications of the ACM, V60, P84, DOI 10.1145/3065386 | eng |
hcfmusp.relation.reference | LeCun Y, 1989, NEURAL COMPUT, V1, P541, DOI 10.1162/neco.1989.1.4.541 | eng |
hcfmusp.relation.reference | LeCun Y., 2015, NATURE, V521, P436, DOI [DOI 10.1038/NATURE14539, 10.1038/nature14539] | eng |
hcfmusp.relation.reference | Lee RS, 2017, SCI DATA, V4, DOI 10.1038/sdata.2017.177 | eng |
hcfmusp.relation.reference | McKinney SM, 2020, NATURE, V577, P89, DOI 10.1038/s41586-019-1799-6 | eng |
hcfmusp.relation.reference | Moreira IC, 2012, ACAD RADIOL, V19, P236, DOI 10.1016/j.acra.2011.09.014 | eng |
hcfmusp.relation.reference | Panceri S. S., 2021, PROC INT JOINT C NEU, P1 | eng |
hcfmusp.relation.reference | Petrini D. G., 2021, CANCER RES, V81 | eng |
hcfmusp.relation.reference | Pham H. H., 2022, PHYSIONET, P1 | eng |
hcfmusp.relation.reference | Rodriguez-Ruiz A, 2019, JNCI-J NATL CANCER I, V111, P916, DOI 10.1093/jnci/djy222 | eng |
hcfmusp.relation.reference | Russakovsky O, 2015, INT J COMPUT VISION, V115, P211, DOI 10.1007/s11263-015-0816-y | eng |
hcfmusp.relation.reference | Schaffter T, 2020, JAMA NETW OPEN, V3, DOI 10.1001/jamanetworkopen.2020.0265 | eng |
hcfmusp.relation.reference | Shen L., INCONSISTENT RESULTS | eng |
hcfmusp.relation.reference | Shen L, 2019, SCI REP-UK, V9, DOI 10.1038/s41598-019-48995-4 | eng |
hcfmusp.relation.reference | Shu X, 2020, IEEE T MED IMAGING, V39, P2246, DOI 10.1109/TMI.2020.2968397 | eng |
hcfmusp.relation.reference | Sorkhei M., 2021, PROC NEURAL INF PROC, V1, P1 | eng |
hcfmusp.relation.reference | Tan MX, 2019, PROC CVPR IEEE, P2815, DOI 10.1109/CVPR.2019.00293 | eng |
hcfmusp.relation.reference | Tan MX, 2019, PR MACH LEARN RES, V97 | eng |
hcfmusp.relation.reference | Wei T., 2021, ARXIV | eng |
hcfmusp.relation.reference | Wu N, 2020, IEEE T MED IMAGING, V39, P1184, DOI 10.1109/TMI.2019.2945514 | eng |
hcfmusp.relation.reference | Zhang XY, 2016, IEEE T PATTERN ANAL, V38, P1943, DOI 10.1109/TPAMI.2015.2502579 | eng |
hcfmusp.scopus.lastupdate | 2024-05-10 | |
relation.isAuthorOfPublication | 97bc2930-e615-479c-870e-9ce4eeda6230 | |
relation.isAuthorOfPublication | dae4f6fe-d43e-4817-abae-9693cd1c9252 | |
relation.isAuthorOfPublication | db4d14fa-6448-42ab-b1e9-4af665935acd | |
relation.isAuthorOfPublication | 6d5113ff-467c-433f-9835-3a3f82a49cdc | |
relation.isAuthorOfPublication.latestForDiscovery | 97bc2930-e615-479c-870e-9ce4eeda6230 |
Files
Original Bundle
- Name:
- art_PETRINI_Breast_Cancer_Diagnosis_in_TwoView_Mammography_Using_EndtoEnd_2022.PDF
- Size:
- 1.06 MB
- Format:
- Adobe Portable Document Format
- Description:
- publishedVersion (English)