Mediated Uncoupled Learning: Learning Functions without Direct Input-output Correspondences
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:11637-11647, 2021.
Abstract
Ordinary supervised learning is useful when we have paired training data of input X and output Y. However, such paired data can be difficult to collect in practice. In this paper, we consider the task of predicting Y from X when we have no paired data of them, but we have two separate, independent datasets of X and of Y, each observed with some mediating variable U; that is, we have two datasets S_X = {(X_i, U_i)} and S_Y = {(U'_j, Y'_j)}. A naive approach is to predict U from X using S_X and then Y from U using S_Y, but we show that this is not statistically consistent. Moreover, predicting U can be more difficult than predicting Y in practice, e.g., when U has higher dimensionality. To circumvent this difficulty, we propose a new method that avoids predicting U but directly learns Y = f(X) by training f(X) with S_X to predict h(U), which is in turn trained with S_Y to approximate Y. We prove the statistical consistency of our method, derive error bounds, and experimentally confirm its practical usefulness.
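
The following is a minimal sketch of the two-stage idea described in the abstract, not the authors' full algorithm or its theoretical analysis: first fit h(U) to approximate Y on S_Y, then fit f(X) to the pseudo-targets h(U_i) on S_X, so that the (possibly high-dimensional) U never has to be predicted at test time. The model class (scikit-learn's MLPRegressor), the dimensions, and the synthetic data are illustrative assumptions only.

# Sketch under the assumptions stated above.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic mediated data: S_X = {(X_i, U_i)} and S_Y = {(U'_j, Y'_j)}.
n, d_x, d_u = 500, 5, 20                      # U has higher dimensionality than Y
A = rng.normal(size=(d_x, d_u))
w = rng.normal(size=d_u)

X = rng.normal(size=(n, d_x))
U = X @ A + 0.1 * rng.normal(size=(n, d_u))   # S_X: (X_i, U_i) pairs
Up = rng.normal(size=(n, d_x)) @ A
Yp = Up @ w + 0.1 * rng.normal(size=n)        # S_Y: (U'_j, Y'_j) pairs

# Step 1: learn h(U) ~ Y from S_Y.
h = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000).fit(Up, Yp)

# Step 2: learn f(X) to match h(U) on S_X, so f maps X directly to a Y-value
# without ever predicting U itself.
f = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000).fit(X, h.predict(U))

# Prediction for a new input uses f alone.
y_hat = f.predict(rng.normal(size=(1, d_x)))

Note that this sequential fit only illustrates the "f(X) predicts h(U), h(U) approximates Y" structure; the paper's contribution includes showing how to train such a pair with statistical consistency guarantees and error bounds, which this sketch does not reproduce.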