Margins, Kernels and Non-linear Smoothed Perceptrons
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):244-252, 2014.
Abstract
We focus on the problem of finding a non-linear classification function that lies in a Reproducing Kernel Hilbert Space (RKHS), both from the primal point of view (finding a perfect separator when one exists) and the dual point of view (giving a certificate of non-existence), with special focus on generalizations of two classical schemes: the Perceptron (primal) and Von Neumann (dual) algorithms. We cast our problem as one of maximizing the regularized normalized hard-margin ($\rho$) in an RKHS and use the Representer Theorem to rephrase it in terms of a Mahalanobis dot-product/semi-norm associated with the kernel's (normalized and signed) Gram matrix. We derive an accelerated smoothed algorithm with a convergence rate of $\frac{\sqrt{\log n}}{\rho}$ given $n$ separable points, which is strikingly similar to the classical kernelized Perceptron algorithm whose rate is $\frac{1}{\rho^2}$. When no such classifier exists, we prove a version of Gordan's separation theorem for RKHSs, and give a reinterpretation of negative margins. This allows us to give guarantees for a primal-dual algorithm that halts in $\min\left\{\frac{\sqrt{n}}{|\rho|}, \frac{\sqrt{n}}{\varepsilon}\right\}$ iterations with either a perfect separator in the RKHS if the primal is feasible or a dual $\varepsilon$-certificate of near-infeasibility.
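For concreteness, below is a minimal Python/NumPy sketch of the classical kernelized Perceptron baseline that the abstract compares against (the $\frac{1}{\rho^2}$-rate scheme), run on the normalized and signed Gram matrix. The RBF kernel, the greedy worst-margin update rule, and the toy data are illustrative assumptions; the paper's accelerated smoothed algorithm and its primal-dual variant are not reproduced here.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_perceptron(X, y, gamma=1.0, max_iters=1000):
    """Greedy kernel Perceptron on the normalized, signed Gram matrix.

    Returns dual counts alpha; the classifier's sign is
    sign(sum_i alpha_i * y_i * k(x_i, x) / ||phi(x_i)||).
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Normalize each mapped point phi(x_i) to unit RKHS norm, then fold in labels:
    # G_ij = y_i * y_j * k(x_i, x_j) / (||phi(x_i)|| * ||phi(x_j)||).
    d = np.sqrt(np.diag(K))
    G = (y[:, None] * y[None, :]) * K / np.outer(d, d)
    alpha = np.zeros(n)
    for _ in range(max_iters):
        margins = G @ alpha          # signed functional margin of each training point
        i = np.argmin(margins)       # greedy variant: pick the worst-margin point
        if alpha.sum() > 0 and margins[i] > 0:
            break                    # every point has positive margin: separator found
        alpha[i] += 1.0              # Perceptron update, expressed in the dual
    return alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = np.sign(X[:, 0] * X[:, 1] + 1e-12)   # XOR-like labels, non-linearly separable
    alpha = kernel_perceptron(X, y, gamma=2.0)
    print("total Perceptron updates:", int(alpha.sum()))

In this dual view, each update simply increments one coordinate of alpha, which mirrors the primal update w += y_i * phi(x_i)/||phi(x_i)||; for separable data the classical analysis bounds the number of such updates by roughly $\frac{1}{\rho^2}$, the rate the abstract contrasts with the smoothed algorithm's $\frac{\sqrt{\log n}}{\rho}$.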