Heterogeneous Aligned Fusion for Survival Classification with Missing Modalities
Proceedings of The 9th International Conference on Medical Imaging with Deep Learning, PMLR 315:2529-2546, 2026.
Abstract
Accurate survival classification is essential for guiding personalized treatment in head and neck cancer. Heterogeneous biomedical data, from histopathology to clinical and laboratory measurements, offer complementary prognostic value but differ in dimensionality, reside in incompatible feature spaces, and are frequently missing, making robust multimodal learning challenging. To address this, we propose HAF (Heterogeneous Aligned Fusion), a three-stage framework for survival classification under heterogeneous and incomplete multimodal inputs. HAF (i) uses detachment and prognostic supervision to obtain stable representations, (ii) performs lightweight global alignment that projects all modalities into a shared latent space while preserving patient-level discriminability, and (iii) applies a monotonic robust-fusion objective that encourages performance to remain stable or improve as modalities are added. To the best of our knowledge, HAF is the first approach that jointly leverages all seven modalities in the HANCOCK cohort. Extensive comparisons against representative late-, early-, attention-based, and bilinear-interaction fusion methods demonstrate that HAF consistently improves both accuracy and robustness under heterogeneous and partially missing modalities.
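The core idea of stage (ii), projecting modalities with incompatible feature spaces into one shared latent space so that fusion tolerates missing inputs, can be illustrated with a minimal sketch. This is not the paper's implementation: the modality names, dimensions, linear projections, and mean pooling below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality feature dimensions (illustrative only; the
# actual HANCOCK modalities and their dimensions are not specified here).
modality_dims = {"histology": 512, "clinical": 32, "labs": 64}
latent_dim = 128

# One linear projection per modality maps each feature space into a
# shared latent space, mirroring the global-alignment idea at a high level.
projections = {
    name: rng.standard_normal((dim, latent_dim)) / np.sqrt(dim)
    for name, dim in modality_dims.items()
}

def fuse(features):
    """Project the available modalities and average them in the shared space.

    `features` maps modality name -> 1-D feature vector; missing modalities
    are simply absent from the dict, so fusion degrades gracefully instead
    of requiring imputation.
    """
    latents = [feats @ projections[name] for name, feats in features.items()]
    if not latents:
        raise ValueError("at least one modality is required")
    return np.mean(latents, axis=0)

# A patient with all modalities present...
full = {name: rng.standard_normal(dim) for name, dim in modality_dims.items()}
z_full = fuse(full)

# ...and the same patient with the lab panel missing.
partial = {name: vec for name, vec in full.items() if name != "labs"}
z_partial = fuse(partial)

print(z_full.shape, z_partial.shape)  # both vectors live in the shared space
```

Because every modality lands in the same 128-dimensional space, a downstream classifier sees a fixed-size input regardless of which modalities a given patient has, which is what makes robustness to partial missingness possible.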