Multimodal Breast Lesion Classification Using Cross-Attention Deep Networks
- Resource Type
- Conference
- Authors
- Vo, Hung Q.; Yuan, Pengyu; He, Tiancheng; Wong, Stephen T.C.; Nguyen, Hien V.
- Source
- 2021 IEEE EMBS International Conference on Biomedical and Health Informatics (BHI), pp. 1-4, Jul. 2021
- Subject
- Bioengineering
- Computing and Processing
- Signal Processing and Analysis
- Systematics
- Conferences
- Estimation
- Computer architecture
- Breast
- Feature extraction
- Mammography
- breast lesion
- breast cancer
- multimodal deep networks
- attention deep networks
- Language
- ISSN
- 2641-3604
- Abstract
- Accurate breast lesion risk estimation can significantly reduce unnecessary biopsies and help doctors decide on optimal treatment plans. Most existing computer-aided systems rely solely on mammogram features to classify breast lesions. While this approach is convenient, it does not fully exploit the useful information in clinical reports, and therefore falls short of optimal performance. Would clinical features significantly improve breast lesion classification compared to using mammograms alone? How should missing clinical information, caused by variation in medical practice, be handled? What is the best way to combine mammograms and clinical features? There is a compelling need for a systematic study to address these fundamental questions. This paper investigates several multimodal deep networks based on feature concatenation, cross-attention, and co-attention for combining mammograms with categorical clinical variables. We show that the proposed architectures significantly improve lesion classification performance, raising the average area under the ROC curve from 0.89 to 0.94. We also evaluate the models when clinical variables are missing.
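The abstract names cross-attention as one of the fusion strategies studied. As a minimal illustrative sketch (not the authors' architecture), the idea can be shown with scaled dot-product attention in NumPy: image feature tokens act as queries over embedded categorical clinical variables, and the attended clinical context is concatenated back onto the image features before classification. All dimensions and variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16  # shared embedding dimension (illustrative choice)

def cross_attention(queries, keys, values):
    """Scaled dot-product attention: each query attends over all keys/values."""
    scores = queries @ keys.T / np.sqrt(keys.shape[1])
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over clinical tokens
    return weights @ values

# Hypothetical inputs: 4 mammogram feature tokens (e.g., from a CNN backbone)
# and 3 embedded categorical clinical variables (e.g., age bin, breast
# density category, family history flag).
image_tokens = rng.normal(size=(4, d))
clinical_tokens = rng.normal(size=(3, d))

# Image tokens query the clinical tokens; the attended clinical context
# is concatenated to each image token as input to a fusion/classification head.
attended = cross_attention(image_tokens, clinical_tokens, clinical_tokens)
fused = np.concatenate([image_tokens, attended], axis=1)
print(fused.shape)  # (4, 32)
```

When clinical variables are missing, one simple option in this sketch is to drop the corresponding rows of `clinical_tokens`; the softmax renormalizes over whatever tokens remain, which is one reason attention-based fusion handles variable-length side information more gracefully than plain concatenation.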