A LUPI distillation-based approach: Application to predicting Proximal Junctional Kyphosis
Proceedings of the 9th Machine Learning for Healthcare Conference, PMLR 252, 2024.
Abstract
We propose a learning algorithm called XGBoost+, a modified version of the extreme gradient boosting algorithm (XGBoost). The new algorithm utilizes privileged information (PI): data that is collected only after inference time and is therefore available solely during training. XGBoost+ incorporates PI into a distillation framework for XGBoost. We evaluate the proposed method on a real-world clinical dataset for predicting Proximal Junctional Kyphosis (PJK). Our approach outperforms vanilla XGBoost, SVM, and SVM+ on a variety of datasets. These results showcase the advantage of privileged information for improving machine learning models in healthcare, where data collected after the point of prediction can be leveraged to build better models.