FutureMorph: Toward Predicting Future Deformation Fields in Longitudinal Imaging

Samah Khawaled, Mert R. Sabuncu
Proceedings of The 9th International Conference on Medical Imaging with Deep Learning, PMLR 315:1802-1820, 2026.

Abstract

Understanding how anatomy evolves over time is essential for tracking disease progression, quantifying risk, and studying healthy development and aging. Existing approaches either synthesize future images without modeling geometry or perform longitudinal registration that requires follow-up scans. We introduce FutureMorph, a framework that treats longitudinal forecasting as metadata-conditioned prediction of future diffeomorphic deformation fields. Given a baseline image (e.g., a brain MRI) and subject-level metadata (age, sex, and clinical variables), FutureMorph predicts time-indexed, subject-specific diffeomorphic deformation fields that explicitly capture future anatomical change. We employ a metadata-conditioned U-Net to estimate stationary velocity vector fields, which are integrated into smooth diffeomorphisms and applied using a spatial transformer to synthesize future images. Experiments on the OASIS-3 dataset show that our framework produces clinically meaningful predicted deformations and realistic future scans, capturing age- and interval-dependent trajectories. Our work provides a new perspective for longitudinal imaging studies by unifying image synthesis and deformation modeling.
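The abstract describes a standard diffeomorphic pipeline: a network predicts a stationary velocity field (SVF), which is exponentiated into a smooth deformation via scaling and squaring and then applied to the baseline image with a spatial transformer. Below is a minimal 2-D NumPy/SciPy sketch of those two steps only (integration and warping); the function names are illustrative and this is not the authors' implementation, which operates on 3-D MRI and uses a metadata-conditioned U-Net to produce the velocity field.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def _grid(h, w):
    # Identity sampling grid, shape (2, H, W), in voxel coordinates.
    return np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij")).astype(float)

def compose(disp_a, disp_b):
    # Composition of displacement fields: (a o b)(x) = b(x) + a(x + b(x)).
    h, w = disp_a.shape[1:]
    coords = _grid(h, w) + disp_b
    a_at_b = np.stack([map_coordinates(disp_a[c], coords, order=1, mode="nearest")
                       for c in range(2)])
    return disp_b + a_at_b

def integrate_svf(velocity, steps=6):
    # Scaling and squaring: phi = exp(v), starting from v / 2**steps
    # and squaring (self-composing) `steps` times.
    phi = velocity / (2 ** steps)
    for _ in range(steps):
        phi = compose(phi, phi)
    return phi

def warp_image(image, disp):
    # Spatial-transformer-style warp: resample the image at x + disp(x).
    h, w = image.shape
    return map_coordinates(image, _grid(h, w) + disp, order=1, mode="nearest")
```

A zero velocity field integrates to the identity deformation, so warping with it returns the input image unchanged; in the paper's setting, the predicted (time-indexed) SVF would replace the zero field and the warped output is the synthesized future scan.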

Cite this Paper


BibTeX
@InProceedings{pmlr-v315-khawaled26a,
  title     = {FutureMorph: Toward Predicting Future Deformation Fields in Longitudinal Imaging},
  author    = {Khawaled, Samah and Sabuncu, Mert R.},
  booktitle = {Proceedings of The 9th International Conference on Medical Imaging with Deep Learning},
  pages     = {1802--1820},
  year      = {2026},
  editor    = {Huo, Yuankai and Gao, Mingchen and Kuo, Chang-Fu and Jin, Yueming and Deng, Ruining},
  volume    = {315},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--10 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v315/main/assets/khawaled26a/khawaled26a.pdf},
  url       = {https://proceedings.mlr.press/v315/khawaled26a.html},
  abstract  = {Understanding how anatomy evolves over time is essential for tracking disease progression, quantifying risk, and studying healthy development and aging. Existing approaches either synthesize future images without modeling geometry or perform longitudinal registration that requires follow-up scans. We introduce FutureMorph, a framework that treats longitudinal forecasting as metadata-conditioned prediction of future diffeomorphic deformation fields. Given a baseline image (e.g., a brain MRI) and subject-level metadata (age, sex, and clinical variables), FutureMorph predicts time-indexed, subject-specific diffeomorphic deformation fields that explicitly capture future anatomical change. We employ a metadata-conditioned U-Net to estimate stationary velocity vector fields, which are integrated into smooth diffeomorphisms and applied using a spatial transformer to synthesize future images. Experiments on the OASIS-3 dataset show that our framework produces clinically meaningful predicted deformations and realistic future scans, capturing age- and interval-dependent trajectories. Our work provides a new perspective for longitudinal imaging studies by unifying image synthesis and deformation modeling.}
}
Endnote
%0 Conference Paper
%T FutureMorph: Toward Predicting Future Deformation Fields in Longitudinal Imaging
%A Samah Khawaled
%A Mert R. Sabuncu
%B Proceedings of The 9th International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2026
%E Yuankai Huo
%E Mingchen Gao
%E Chang-Fu Kuo
%E Yueming Jin
%E Ruining Deng
%F pmlr-v315-khawaled26a
%I PMLR
%P 1802--1820
%U https://proceedings.mlr.press/v315/khawaled26a.html
%V 315
%X Understanding how anatomy evolves over time is essential for tracking disease progression, quantifying risk, and studying healthy development and aging. Existing approaches either synthesize future images without modeling geometry or perform longitudinal registration that requires follow-up scans. We introduce FutureMorph, a framework that treats longitudinal forecasting as metadata-conditioned prediction of future diffeomorphic deformation fields. Given a baseline image (e.g., a brain MRI) and subject-level metadata (age, sex, and clinical variables), FutureMorph predicts time-indexed, subject-specific diffeomorphic deformation fields that explicitly capture future anatomical change. We employ a metadata-conditioned U-Net to estimate stationary velocity vector fields, which are integrated into smooth diffeomorphisms and applied using a spatial transformer to synthesize future images. Experiments on the OASIS-3 dataset show that our framework produces clinically meaningful predicted deformations and realistic future scans, capturing age- and interval-dependent trajectories. Our work provides a new perspective for longitudinal imaging studies by unifying image synthesis and deformation modeling.
APA
Khawaled, S., & Sabuncu, M. R. (2026). FutureMorph: Toward predicting future deformation fields in longitudinal imaging. Proceedings of The 9th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research, 315:1802-1820. Available from https://proceedings.mlr.press/v315/khawaled26a.html.