From Basic to Extra Features: Hypergraph Transformer Pretrain-then-Finetuning for Balanced Clinical Predictions on EHR
Proceedings of the fifth Conference on Health, Inference, and Learning, PMLR 248:182-197, 2024.
Abstract
Electronic Health Records (EHRs) contain rich patient information and are crucial for clinical research and practice. In recent years, deep learning models have been applied to EHRs, but they often rely on massive numbers of features, which may not be readily available for all patients. We propose HTP-Star (short for Hypergraph Transformer Pretrain-then-finetuning with Smoothness-induced regularization and Reweighting), which leverages hypergraph structures with a pretrain-then-finetune framework for modeling EHR data, enabling seamless integration of additional features. Additionally, we design two techniques, namely (1) Smoothness-inducing Regularization and (2) Group-balanced Reweighting, to enhance the model's robustness during finetuning. Through experiments conducted on two real EHR datasets, we demonstrate that HTP-Star consistently outperforms various baselines.
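The two finetuning techniques named above can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, the Gaussian perturbation, the symmetric-KL penalty, and the inverse-frequency weighting scheme are all assumptions chosen to convey the general ideas: smoothness-inducing regularization penalizes divergence between predictions on clean and slightly perturbed inputs, and group-balanced reweighting scales each sample's loss so that small groups (e.g., patients with only basic features) are not dominated by large ones.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with max-shift for numerical stability."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def smoothness_penalty(predict, x, eps=1e-3, rng=None):
    """Symmetric KL between predictions on clean and perturbed inputs.

    Hypothetical sketch: `predict` maps a feature matrix to class
    probabilities; the paper's exact formulation may differ.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    p = predict(x)
    q = predict(x + eps * rng.standard_normal(x.shape))  # perturbed view
    kl = lambda a, b: np.sum(a * (np.log(a + 1e-12) - np.log(b + 1e-12)), axis=-1)
    return float(np.mean(0.5 * (kl(p, q) + kl(q, p))))

def group_balanced_weights(groups):
    """Per-sample loss weights inversely proportional to group frequency,
    normalized so the mean weight is 1 (hypothetical scheme)."""
    groups = np.asarray(groups)
    uniq, counts = np.unique(groups, return_counts=True)
    freq = dict(zip(uniq.tolist(), counts.tolist()))
    w = np.array([1.0 / freq[g] for g in groups.tolist()], dtype=float)
    return w * len(w) / w.sum()
```

For example, with groups `[0, 0, 0, 1]`, the lone member of group 1 receives three times the weight of each member of group 0, so both groups contribute equally to a weighted loss.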