Minimax rates for cost-sensitive learning on manifolds with approximate nearest neighbours
Proceedings of the 28th International Conference on Algorithmic Learning Theory, PMLR 76:1156, 2017.
Abstract
We study the approximate nearest neighbour method for cost-sensitive classification on low-dimensional manifolds embedded within a high-dimensional feature space. We determine the minimax learning rates for distributions on a smooth manifold in a cost-sensitive setting. This generalises a classic result of Audibert and Tsybakov. Building upon recent work of Chaudhuri and Dasgupta, we prove that these minimax rates are attained by the approximate nearest neighbour algorithm, where neighbours are computed in a randomly projected low-dimensional space. In addition, we give a bound on the number of dimensions required for the projection which depends solely upon the reach and dimension of the manifold, combined with the regularity of the marginal.
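The core idea the abstract refers to, projecting high-dimensional data onto a random low-dimensional subspace and then running nearest-neighbour classification there, can be illustrated with a minimal sketch. This is not the paper's exact algorithm (it omits the cost-sensitive weighting and the approximate-neighbour data structure); the function name and the Gaussian projection choice are illustrative assumptions.

```python
import numpy as np

def random_projection_nn_classify(X_train, y_train, X_test, d, seed=0):
    """Illustrative sketch (not the paper's precise method): project
    D-dimensional points onto a random d-dimensional subspace via a
    Gaussian matrix (Johnson-Lindenstrauss style), then label each test
    point by its nearest training neighbour in the projected space."""
    rng = np.random.default_rng(seed)
    D = X_train.shape[1]
    # Random Gaussian projection; scaling by 1/sqrt(d) approximately
    # preserves pairwise distances with high probability.
    P = rng.normal(size=(D, d)) / np.sqrt(d)
    Z_train = X_train @ P
    Z_test = X_test @ P
    preds = []
    for z in Z_test:
        # Exact 1-NN in the projected space; a real implementation
        # would use an approximate nearest-neighbour index here.
        dists = np.linalg.norm(Z_train - z, axis=1)
        preds.append(y_train[np.argmin(dists)])
    return np.array(preds)
```

The point of the projection step is that, for data concentrated near a low-dimensional manifold, a target dimension d depending only on the manifold's intrinsic dimension and reach (not the ambient dimension D) suffices to preserve neighbour relations well enough for classification.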