ResGAT: A Residual Graph Attention Network for Cancer Subtype Classification in Whole Slide Images

Zhenhan Lin, Hao Tong, Yunfei Hu, Xianyong Gui, Jeanne Shen, Byrne Lee, Lu Zhang, Daniel Moyer, Mu Zhou, Xin Maizie Zhou, Konstantinos Votanopoulos
Proceedings of The 9th International Conference on Medical Imaging with Deep Learning, PMLR 315:3911-3930, 2026.

Abstract

Multiple instance learning (MIL) provides a weakly supervised framework for whole slide image (WSI) classification, enabling slide-level prediction from gigapixel images with only slide-level labels. However, WSI subtype classification in realistic settings remains challenging. In this work, we propose ResGAT, a residual graph attention framework that operates on hybrid $k$-NN patch graphs and models WSI representations with stacked residual graph attention blocks. ResGAT is evaluated on the subtype classification task across a rare, class-imbalanced appendiceal cancer cohort, BRACS, and two TCGA datasets. It outperforms state-of-the-art MIL baselines on the appendiceal cancer and BRACS cohorts, and remains competitive on the TCGA datasets. On the appendiceal cancer cohort, we further assess cross-site generalization via few-shot adaptation under source shift, showing that ResGAT adapts effectively to new domains with limited labels. An ablation study is provided to validate the effectiveness of key architectural components of our method.
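The abstract does not specify how the hybrid $k$-NN patch graph or the residual attention blocks are implemented; the sketch below only illustrates the general data flow they describe. The Euclidean metric, the dot-product attention scores, and the parameter-free aggregation are assumptions for illustration: ResGAT's actual blocks use learned (multi-head) attention, and its "hybrid" graphs may combine feature-space and spatial neighborhoods differently.

```python
import math

def knn_patch_graph(features, k=2):
    """Build a k-NN adjacency list over patch feature vectors.

    features: list of equal-length feature vectors (one per WSI patch).
    Returns {patch_index: [indices of its k nearest patches]} under
    Euclidean distance (an assumed metric, not taken from the paper).
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    graph = {}
    for i, fi in enumerate(features):
        # All other patches, sorted by distance to patch i.
        neighbors = sorted(
            (j for j in range(len(features)) if j != i),
            key=lambda j: dist(fi, features[j]),
        )
        graph[i] = neighbors[:k]
    return graph

def residual_attention_step(features, graph):
    """One residual attention update over the patch graph.

    Each patch attends over its neighbors with softmax-normalized
    dot-product scores, then adds the aggregated message back onto
    its own features (the residual connection). Parameter-free; a
    real residual GAT block would use learned projections instead.
    """
    updated = []
    for i, fi in enumerate(features):
        nbrs = graph[i]
        scores = [sum(x * y for x, y in zip(fi, features[j])) for j in nbrs]
        m = max(scores)                       # subtract max for stable softmax
        weights = [math.exp(s - m) for s in scores]
        z = sum(weights)
        agg = [
            sum((w / z) * features[j][d] for w, j in zip(weights, nbrs))
            for d in range(len(fi))
        ]
        updated.append([x + a for x, a in zip(fi, agg)])  # residual add
    return updated

# Toy example: four 2-D "patch embeddings" forming two clusters.
patches = [[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]]
g = knn_patch_graph(patches, k=1)
print(g)  # each patch links to its closest peer: {0: [1], 1: [0], 2: [3], 3: [2]}
print(residual_attention_step(patches, g))
```

Stacking several such residual steps (with learned weights) is what the abstract refers to as "stacked residual graph attention blocks"; the residual connection lets depth grow without washing out each patch's own features.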

Cite this Paper


BibTeX
@InProceedings{pmlr-v315-lin26c,
  title     = {ResGAT: A Residual Graph Attention Network for Cancer Subtype Classification in Whole Slide Images},
  author    = {Lin, Zhenhan and Tong, Hao and Hu, Yunfei and Gui, Xianyong and Shen, Jeanne and Lee, Byrne and Zhang, Lu and Moyer, Daniel and Zhou, Mu and Zhou, Xin Maizie and Votanopoulos, Konstantinos},
  booktitle = {Proceedings of The 9th International Conference on Medical Imaging with Deep Learning},
  pages     = {3911--3930},
  year      = {2026},
  editor    = {Huo, Yuankai and Gao, Mingchen and Kuo, Chang-Fu and Jin, Yueming and Deng, Ruining},
  volume    = {315},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--10 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v315/main/assets/lin26c/lin26c.pdf},
  url       = {https://proceedings.mlr.press/v315/lin26c.html},
  abstract  = {Multiple instance learning (MIL) provides a weakly supervised framework for whole slide image (WSI) classification, enabling slide-level prediction from gigapixel images with only slide-level labels. However, WSI subtype classification in realistic settings is still challenging. In this work, we propose ResGAT, a residual graph attention framework that operates on hybrid $k$-NN patch graphs and models WSI representations with stacked residual graph attention blocks. ResGAT is evaluated on the subtype classification task across a rare, class-imbalanced appendiceal cancer cohort, BRACS and two TCGA datasets. It outperforms SOTA MIL baselines on the appendiceal cancer and BRACS cohorts, and remains competitive on the TCGA datasets. On the appendiceal cancer cohort, we further assess cross-site generalization via few-shot adaptation under source shift, showing that ResGAT adapts effectively to new domains with limited labels. An ablation study is provided to validate the effectiveness of key architectural components of our method.}
}
Endnote
%0 Conference Paper
%T ResGAT: A Residual Graph Attention Network for Cancer Subtype Classification in Whole Slide Images
%A Zhenhan Lin
%A Hao Tong
%A Yunfei Hu
%A Xianyong Gui
%A Jeanne Shen
%A Byrne Lee
%A Lu Zhang
%A Daniel Moyer
%A Mu Zhou
%A Xin Maizie Zhou
%A Konstantinos Votanopoulos
%B Proceedings of The 9th International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2026
%E Yuankai Huo
%E Mingchen Gao
%E Chang-Fu Kuo
%E Yueming Jin
%E Ruining Deng
%F pmlr-v315-lin26c
%I PMLR
%P 3911--3930
%U https://proceedings.mlr.press/v315/lin26c.html
%V 315
%X Multiple instance learning (MIL) provides a weakly supervised framework for whole slide image (WSI) classification, enabling slide-level prediction from gigapixel images with only slide-level labels. However, WSI subtype classification in realistic settings is still challenging. In this work, we propose ResGAT, a residual graph attention framework that operates on hybrid $k$-NN patch graphs and models WSI representations with stacked residual graph attention blocks. ResGAT is evaluated on the subtype classification task across a rare, class-imbalanced appendiceal cancer cohort, BRACS and two TCGA datasets. It outperforms SOTA MIL baselines on the appendiceal cancer and BRACS cohorts, and remains competitive on the TCGA datasets. On the appendiceal cancer cohort, we further assess cross-site generalization via few-shot adaptation under source shift, showing that ResGAT adapts effectively to new domains with limited labels. An ablation study is provided to validate the effectiveness of key architectural components of our method.
APA
Lin, Z., Tong, H., Hu, Y., Gui, X., Shen, J., Lee, B., Zhang, L., Moyer, D., Zhou, M., Zhou, X. M., & Votanopoulos, K. (2026). ResGAT: A Residual Graph Attention Network for Cancer Subtype Classification in Whole Slide Images. Proceedings of The 9th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 315:3911-3930. Available from https://proceedings.mlr.press/v315/lin26c.html.

Related Material