An end-to-end weakly supervised learning framework for cancer subtype classification using histopathological slides

Hongren Zhou, Hechang Chen*, Bo Yu, Shuchao Pang, Xianling Cong, Lele Cong

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

AI-powered analysis of histopathology data has become an invaluable assistant for pathologists due to its efficiency and accuracy. However, existing deep learning methods still face challenges in identifying cancer subtypes. For example, ultra-high-resolution histopathological slides generally contain numerous redundant features that are not useful for cancer subtype classification and thus incur considerable computational costs. Moreover, the lack of expert annotations of disease-specific regions (i.e., patch-level annotations) makes it difficult to learn such histological features from slide-level labels alone. In this paper, we propose an end-to-end weakly supervised learning framework, called EWSLF, to address these issues. First, we adopt a cluster-based sampling strategy that refines the histological features used for training, improving classification accuracy and reducing computational cost. Second, we employ a multi-branch attention mechanism that produces patch-level pseudo-labels and aggregates patch features into slide-level features, compensating for the missing patch-level expert annotations. Experimental results on both public and in-house datasets demonstrate that our model outperforms state-of-the-art methods for cancer subtype classification while producing credible results. Code: https://github.com/hongren21/ewslf.
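The linked repository contains the authors' actual implementation. Purely as a rough illustration of the two steps the abstract describes, the PyTorch sketch below shows (a) cluster-based sampling that keeps a few patches per feature cluster to prune redundancy, and (b) a multi-branch attention module whose per-patch attention scores serve as patch-level pseudo-labels while weighting the pooling into a slide-level feature. All names (cluster_sample, MultiBranchAttentionMIL), shapes, and hyperparameters are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of the two steps in the abstract; see the authors' repo for the real code.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

def cluster_sample(patch_feats: torch.Tensor, n_clusters: int = 8,
                   per_cluster: int = 16) -> torch.Tensor:
    """Cluster patch features and keep at most `per_cluster` patches per cluster,
    discarding redundant near-duplicate patches before training."""
    labels = torch.from_numpy(
        KMeans(n_clusters=n_clusters, n_init=10).fit(patch_feats.numpy()).labels_
    )
    keep = []
    for c in range(n_clusters):
        idx = (labels == c).nonzero(as_tuple=True)[0]
        keep.append(idx[torch.randperm(len(idx))[:per_cluster]])
    return patch_feats[torch.cat(keep)]

class MultiBranchAttentionMIL(nn.Module):
    """One attention branch per class: each branch scores every patch (the scores
    act as patch-level pseudo-labels) and pools patches into a slide-level feature."""
    def __init__(self, dim: int = 512, n_classes: int = 2):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 128), nn.Tanh(), nn.Linear(128, 1))
            for _ in range(n_classes)
        )
        self.classifier = nn.Linear(dim * n_classes, n_classes)

    def forward(self, feats: torch.Tensor):          # feats: (n_patches, dim)
        slide_parts, pseudo = [], []
        for branch in self.branches:
            a = torch.softmax(branch(feats), dim=0)  # (n_patches, 1) attention weights
            pseudo.append(a.squeeze(-1))             # per-patch pseudo-label scores
            slide_parts.append((a * feats).sum(0))   # attention-weighted pooling
        slide_feat = torch.cat(slide_parts)          # (dim * n_classes,)
        return self.classifier(slide_feat), torch.stack(pseudo, dim=1)
```

Under these assumptions, a slide would be processed as `feats = cluster_sample(all_patch_feats)` followed by `logits, pseudo_labels = MultiBranchAttentionMIL()(feats)`, with only the slide-level label supervising `logits`.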

Original language: English
Article number: 121379
Pages (from-to): 1-12
Number of pages: 12
Journal: Expert Systems with Applications
Volume: 237
DOIs
Publication status: Published - 1 Mar 2024

Keywords

  • attention mechanism
  • histopathological data
  • interpretable diagnosis
  • subtype classification
  • weakly supervised
