Show simple item record

dc.contributor.author: POLAT, Asli Nur
dc.contributor.author: MOHAMMED, Hussein Mahmood Abdo
dc.date.accessioned: 2026-04-21T12:28:19Z
dc.date.available: 2026-04-21T12:28:19Z
dc.date.issued: 2025
dc.identifier.issn: 2147-3129
dc.identifier.uri: http://dspace.beu.edu.tr:8080/xmlui/handle/123456789/16697
dc.description.abstract: Breast cancer is the most common cancer among women and the most frequently diagnosed cancer worldwide. Recent advancements in deep learning have led to significant improvements in tumor detection from breast ultrasound (BUSI) images, enhancing the diagnostic accuracy of breast cancer screening. Although deep convolutional neural networks (CNNs) and transformer-based architectures have individually yielded promising results, challenges such as low contrast, spatial variability, and irregular tumor shapes continue to hinder the robustness of current methods. Therefore, in this study, a novel hybrid CNN–Transformer framework is proposed to improve discriminative feature extraction for breast ultrasound (BUSI) image analysis. The network employs a dual-branch architecture, integrating features extracted from both CNN and transformer models. In the first branch, the Swin Transformer is combined with a Triplet Attention module to strengthen its ability to learn long-range dependencies and global contextual information. The Triplet Attention module processes feature maps along three orthogonal axes, enabling a more effective representation of both spatial and channel-level relationships. The second branch incorporates the EfficientNet architecture augmented with an Efficient Channel Attention (ECA) module, which facilitates adaptive channel-level feature recalibration. This design allows the model to emphasize diagnostically salient regions within ultrasound images. High-level features from both branches are fused for final classification. Experimental results on the BUSI dataset demonstrate that the proposed architecture achieves superior performance, with 97.4% accuracy, 97.9% precision, 97.9% sensitivity, and a 97.9% F1-score. These outcomes confirm the effectiveness of the proposed hybrid CNN–Transformer design in improving automated breast cancer diagnosis using ultrasound imaging.
dc.language.iso: English
dc.publisher: Bitlis Eren Üniversitesi
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Breast cancer
dc.subject: Ultrasound imaging
dc.subject: Transformer networks
dc.subject: CNNs
dc.subject: Attention mechanisms
dc.subject: Hybrid deep learning
dc.title: IMPROVING BREAST CANCER DIAGNOSIS USING ATTENTION-ENHANCED HYBRID CNN–TRANSFORMER MODEL
dc.type: Article
dc.identifier.issue: 4
dc.identifier.startpage: 2616
dc.identifier.endpage: 2638
dc.relation.journal: BİTLİS EREN ÜNİVERSİTESİ FEN BİLİMLERİ DERGİSİ
dc.identifier.volume: 14
dc.contributor.department: Lisansüstü Eğitim Enstitüsü
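The abstract describes a dual-branch design in which one branch applies Efficient Channel Attention (ECA) for channel-level recalibration and the two branch embeddings are fused for classification. A minimal NumPy sketch of those two ideas follows; the function names, kernel values, and the linear classification head are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def eca_recalibrate(features, kernel):
    """ECA-style channel recalibration (hypothetical minimal version).

    features: (C, H, W) feature map; kernel: odd-length 1D conv weights.
    ECA pools each channel to one descriptor, runs a lightweight 1D
    convolution across channels (no dimensionality reduction), and
    rescales each channel by a sigmoid attention weight.
    """
    c = features.shape[0]
    desc = features.mean(axis=(1, 2))          # global average pooling
    k = len(kernel)
    padded = np.pad(desc, k // 2, mode="edge")
    attn = sigmoid(np.array([np.dot(padded[i:i + k], kernel)
                             for i in range(c)]))
    return features * attn[:, None, None]      # per-channel rescaling

def fuse_and_classify(cnn_feat, trans_feat, w, b):
    """Late fusion of the two branch embeddings plus a linear head."""
    fused = np.concatenate([cnn_feat, trans_feat])
    logits = w @ fused + b
    return int(np.argmax(logits))
```

This sketches only the fusion and recalibration logic; the Swin Transformer, Triplet Attention, and EfficientNet backbones themselves would come from a deep-learning framework.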

